Nutrient pollution, such as nitrate (NO3−), can degrade water quality in rivers used as sources of drinking water. This raises the question of how nutrients move through a watershed depending on factors such as land use and anthropogenic sources. Researchers have developed several nutrient export coefficient models based on these factors. For this purpose, statistical data including historical water quality and land use data for the Melen Watershed were used. Nitrate export coefficients are estimates of the total load or mass of nitrate (NO3−) exported from a watershed, standardized to unit area and unit time (e.g. kg/km2/day). In this study, nitrate export coefficients for the Melen Watershed were determined using a model that covers both Frequentist and Bayesian approaches. The river retention coefficient was determined and introduced into the model as an important variable.
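The standardization described above can be sketched numerically. A minimal sketch in Python, using hypothetical load, area and retention figures (the actual Melen Watershed values are not reproduced here), shows how a load is converted to an export coefficient with a retention correction:

```python
# Illustrative sketch with hypothetical numbers: a nitrate export
# coefficient standardizes the total NO3- load leaving a watershed
# to unit area and unit time (kg/km2/day).

def export_coefficient(load_kg, area_km2, days, retention=0.0):
    """Load exported per km2 per day; `retention` is the assumed
    fraction of the load retained in the river network (0..1)."""
    delivered = load_kg * (1.0 - retention)
    return delivered / (area_km2 * days)

# Hypothetical watershed: 365 t NO3-/year, 730 km2, 20% river retention.
ec = export_coefficient(load_kg=365_000, area_km2=730, days=365, retention=0.2)
print(round(ec, 3))  # kg/km2/day
```

The retention factor illustrates why the paper treats the river retention coefficient as an important model variable: ignoring it would overstate the exported load.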
The paper presents preliminary results of investigations into the relationship between turbidity and other quality parameters in SBR plant effluent. The laboratory tests demonstrated a high correlation between effluent turbidity and total suspended solids (TSS) concentration, as well as between TSS and COD. Such relationships would make it possible to continuously monitor and control the quality of a wastewater discharge using turbidity measurement.
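The correlation claim above can be illustrated with a Pearson coefficient computed on paired measurements. The data below are hypothetical, invented only to show the calculation, not the study's measurements:

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mx) ** 2 for a in x)
                      * sum((b - my) ** 2 for b in y))

# Hypothetical paired readings: turbidity (NTU) vs TSS (mg/L).
turbidity = [2.1, 3.4, 5.0, 7.2, 9.8, 12.5]
tss = [3.0, 4.8, 7.1, 10.0, 13.9, 17.6]
r = pearson(turbidity, tss)
print(r)
```

A coefficient close to 1 on such data is what would justify substituting a continuous turbidity probe for laboratory TSS determinations.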
This work presents a comparative analysis of the phenolic composition (UHPLC-PDA-ESI-MS3, HPLC-PDA fingerprint, UV-spectrophotometric methods) and antioxidant activity (DPPH, FRAP) of leaf samples from two vegetation seasons of a medicinal and dietary plant, Sorbus domestica, growing in its natural habitat (Croatia, C) and cultivated in Poland (P). The samples from both sources were rich in structurally diverse polyphenols (44 analytes; P: 73.4–76.6 and C: 98.3–106.7 mg GAE/g dry leaves), with flavan-3-ols and flavonoids dominating. The greatest qualitative and quantitative differences were observed for flavonoids (P: 14.3–20.3%; C: 27.5–34.1% of polyphenols): in the Polish samples flavonoid diglycosides predominated, while in the Croatian samples the contents of monoglycosides and diglycosides were similar. In the case of dry methanolic extracts, despite the higher extraction efficiency obtained for the Croatian samples (32–36% vs 23–24%), the quality of the extracts was comparable, both in terms of the total phenolic content (P: 269.4–280.0; C: 297.6–304.4 mg GAE/g dry extract) and the antioxidant activity parameters (DPPH EC50, μg/mL: P: 10.5–10.9, C: 10.0–10.3; FRAP, mmol Fe2+/g: P: 6.64–7.13, C: 7.06–7.11). As a result, the study confirmed the influence of environmental conditions on the phenolic profile and antioxidant capacity of S. domestica leaves, and showed that, despite some differences, plant materials from both Poland and Croatia might be suitable for the production of natural health products.
In this study, the concepts of simultaneous user association and resource allocation in non-orthogonal multiple access (NOMA) systems are investigated, with subscribers randomly distributed across the cells. A novel cooperative energy harvesting model is introduced in which user equipment near the base stations acts as a relay for more distant subscribers. To account for the local limitations of alternative energy resources, it is assumed that alternative energy is shared among the base stations by means of a dynamic grid network. In this architecture, the resource allocation and user association frameworks must be reconfigured, because conventional schemes assume orthogonal multiple access. Hence, this paper proposes a novel approach to jointly optimal cooperative power allocation and user association that maximizes the energy efficiency of the whole system while the quality-of-experience parameters are kept bounded during multi-cell multicast sessions. The model is further extended to joint multi-layered resource control and user association that can distinguish the service pattern in cooperative energy-harvesting heterogeneous systems with NOMA, achieving better resource optimality than current approaches. The effectiveness of the proposed approach is confirmed by numerical results, which also reveal that NOMA can provide greater energy efficiency than conventional orthogonal multiple access approaches such as the MAX-SINR scheme.
The article discusses the values and social responsibility of universities. On the one hand, it recalls the foundations of university activity, created by research and education, and the formative role of universities. On the other hand, it reminds us that the heart of universities, their DNA, are academic values, defined primarily in the Magna Charta Universitatum but also in many other documents, such as the Code of Values of the Jagiellonian University. Hence, universities are increasingly referred to not only as universities of knowledge but also as universities of wisdom. Together, these form the basis of the social responsibility of universities. However, they alone are not enough for this social responsibility to materialise: appropriate behaviour and actions are essential, because knowledge alone is not everything. Such actions are always necessary, but especially when we find ourselves, as a country, as humanity and as a planet, in a crisis related to the climate disaster, which we are already partially experiencing. After presenting the most important current facts related to the climate and environmental crisis, the article outlines the tasks that universities should urgently undertake in this context, from broadly understood education, through convincing politicians to take ambitious and rapid action, to intensive work on innovative solutions that can help reduce the threats posed by the climate and environmental crisis, pointing out, among others, the initiatives proposed by the newly created U7 network of universities.
Based on the publications regarding new or recent measurement systems for tokamak plasma experiments, it can be seen that the monitoring and quality validation of input signals for the computation stage is done in different, often simple, ways. The paper describes a unique approach to implementing a novel evaluation and data quality monitoring (EDQM) model for use in various measurement systems. The model was adapted for an FPGA-based GEM soft X-ray measurement system. The EDQM elements have been connected to the base firmware using PCI-E DMA real-time data streaming with minimal modification. On-board DDR3 memory has been used as additional storage. A description of the implemented elements is provided, along with the designed data processing tools and an advanced simulation environment based on the Questa software.
The new legislative provisions regulating the solid fuel trade in Poland, and the resolutions of provincial assemblies, assume, inter alia, a ban on the household use of lignite fuels and solid fuels produced with their use; this also applies to coal sludge, coal flotation concentrates, and mixtures produced with their use. These changes will force the producers of these materials to find new ways and methods of their utilization, including their modification (mixing with other products or waste) in order to increase their attractiveness to the commercial power industry. The presented paper focuses on the analysis of coal sludge, classified either as waste (codes 01 04 12 and 01 04 81) or as a by-product in the production of coals of different types. A preliminary analysis, aimed at presenting changes in quality parameters and based on mixtures of hard coal sludge (PG SILESIA) with coal dust from lignite (pulverized lignite) (LEAG), has been carried out. The analysis of the quality parameters of the discussed mixtures included the determination of the calorific value, ash content, volatile matter content, moisture content, heavy metal content (Cd, Tl, Hg, Sb, As, Pb, Cr, Co, Cu, Mn, Ni, and W), and sulfur content. The preliminary analysis has shown that mixing coal sludge with coal dust from lignite, followed by granulation, allows a product with the desired quality and physical parameters to be obtained, which is attractive to the commercial power industry. Compared to coal sludge, granulates made of coal sludge and coal dust from lignite, with or without ground dolomite, have a higher sulfur content (in the range of 1–1.4%). However, this is still an acceptable content for solid fuels in the commercial power industry.
Compared to the basic coal sludge sample, the observed increase in the content of individual toxic components in the mixture samples is small; it can therefore be concluded that the addition of coal dust from lignite or carbonates has no significant effect on the total content of the individual elements. The calorific value is a key parameter determining usefulness in the power industry. For coal sludge, its value on an as-received basis is in the range of 9.4–10.6 MJ/kg. In the case of the examined mixtures of coal sludge with coal dust from lignite, the calorific value increases significantly, to the range of 14.0–14.5 MJ/kg (as received). The obtained values increase the usefulness of the product in the commercial power industry while, at the same time, meeting the requirements for the combustion of solid fuels to a greater extent. A slight decrease in the calorific value is observed in the case of granulation with the addition of CaO or carbonates. Taking the analyzed parameters into account, it can be concluded that the prepared mixtures can be used for combustion in units with flue gas desulfurization plants and a nominal thermal power of not less than 1 MW. At this stage of the work, no cost analysis was carried out.
The paper presents the results of experimental validation of a set of innovative software services supporting the processes of achieving, assessing and maintaining conformance with standards and regulations. The study involved several hospitals implementing the Accreditation Standard promoted by the Polish Ministry of Health. First, we introduce the NOR-STA services that implement the TRUST-IT methodology of argument management. Then we describe and justify a set of metrics aimed at assessing the effectiveness and efficiency of the services. Next, we present the values of the metrics computed from the collected data. The paper concludes with an interpretation and discussion of the measurement results with respect to the objectives of the validation experiment.
The paper presents an application of advanced data-driven (soft) models in finding the most probable particular causes of missed ductile iron melts. The proposed methodology was tested using a real foundry data set containing 1020 records, with the contents of 9 chemical elements in the iron as the process input variables and the ductile iron grade as the output. This dependent variable was of discrete (nominal) type with four possible values: ‘400/18’, ‘500/07’, ‘500/07 special’ and ‘non-classified’, i.e. the missed melt. Several types of classification models were built and tested: an MLP-type Artificial Neural Network, a Support Vector Machine and two versions of Classification Trees. The best prediction accuracy was achieved by one of the Classification Tree models, which was then used in simulations leading to the conversion of the missed melts to the expected grades. Two strategies of changing the input values (chemical composition) were tried: changing the content of a single element at a time, and simultaneous changes of a selected pair of elements. It was found that in the vast majority of the missed melts, changes of single element concentrations led to the change from non-classified iron to its expected grade. In the case of the three remaining melts, simultaneous changes of pairs of element concentrations proved successful, and these cases were in agreement with foundry staff expertise. It is concluded that utilizing an advanced data-driven process model can significantly facilitate the diagnosis of defective products and out-of-control foundry processes.
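The single-element search strategy described above can be sketched in a few lines. The classifier below is a toy rule-based stand-in with hypothetical thresholds, not the Classification Tree trained in the study; it only illustrates the workflow of nudging one element's concentration until the melt leaves the 'non-classified' bin:

```python
# Hedged illustration: `classify` is a hypothetical surrogate for the
# trained Classification Tree (thresholds invented for this sketch).

def classify(si, mg):
    """Toy grade rule on Si and Mg content (wt%), stand-in only."""
    if mg < 0.03:
        return 'non-classified'
    return '400/18' if si >= 2.4 else '500/07'

def fix_single_element(si, mg, step=0.005, max_steps=20):
    """Raise one element at a time until the melt leaves the
    'non-classified' bin; returns (element, new_value, grade)."""
    for name, value in (('mg', mg), ('si', si)):
        v = value
        for _ in range(max_steps):
            v += step
            grade = classify(si, v) if name == 'mg' else classify(v, mg)
            if grade != 'non-classified':
                return name, round(v, 3), grade
    return None

result = fix_single_element(si=2.1, mg=0.02)
print(result)
```

The pair-change strategy from the paper would extend the loop to iterate over two elements simultaneously; the single-element version sufficed for most melts in the study.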
Statistical Process Control (SPC) based on the Shewhart’s type control charts, is widely used in contemporary manufacturing industry, including many foundries. The main steps include process monitoring, detection the out-of-control signals, identification and removal of their causes. Finding the root causes of the process faults is often a difficult task and can be supported by various tools, including datadriven mathematical models. In the present paper a novel approach to statistical control of ductile iron melting process is proposed. It is aimed at development of methodologies suitable for effective finding the causes of the out-of-control signals in the process outputs, defined as ultimate tensile strength (Rm) and elongation (A5), based mainly on chemical composition of the alloy. The methodologies are tested and presented using several real foundry data sets. First, correlations between standard abnormal output patterns (i.e. out-of-control signals) and corresponding inputs patterns are found, basing on the detection of similar patterns and similar shapes of the run charts of the chemical elements contents. It was found that in a significant number of cases there was no clear indication of the correlation, which can be attributed either to the complex, simultaneous action of several chemical elements or to the causes related to other process variables, including melting, inoculation, spheroidization and pouring parameters as well as the human errors. A conception of the methodology based on simulation of the process using advanced input - output regression modelling is presented. The preliminary tests have showed that it can be a useful tool in the process control and is worth further development. The results obtained in the present study may not only be applied to the ductile iron process but they can be also utilized in statistical quality control of a wide range of different discrete processes.
The paper deals with the problem of the optimal use of an automated workplace for HPDC technology, mainly from the aspects of operation sequence, work cycle efficiency, and planning of the use and servicing of the HPDC casting machine. Possible ways to analyse automated units for HPDC are presented. The experimental part focused on the rationalization of the current work cycle time for die casting of an aluminium alloy. The workplace was described in detail in the project. Detailed measurements were carried out, and the cycle of the casting workplace was mapped with the help of charts and graphs. Other parameters and settings were also identified. Proposals for improvements were made after the first measurements, and these improvements were subsequently verified. The main actions were primarily software modifications of the casting centre, since today's sophisticated workplaces offer a relatively wide range of modifications without any physical harm to the machines themselves: it is possible to change settings or unlock some unsatisfactory parameters.
Statistical Process Control (SPC), based on the well-known Shewhart control charts, is widely used in contemporary manufacturing industry, including many foundries. However, the classic SPC methods require that the measured quantities, e.g. process or product parameters, are not auto-correlated, i.e. that their current values do not depend on the preceding ones. For processes which do not obey this assumption, Special Cause Control (SCC) charts were proposed, utilizing the residual data obtained from time-series analysis. In the present paper, the results of applying SCC charts to a green sand processing system are presented. The tests, made on real industrial data collected in a big iron foundry, were aimed at comparing the occurrences of out-of-control signals detected in the original data with those appearing in the residual data. It was found that application of the SCC charts reduces the number of signals in almost all cases. It is concluded that this can be helpful in avoiding false signals, i.e. those resulting from predictable factors.
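The residual computation behind an SCC chart can be sketched as follows. A minimal sketch, assuming a first-order autoregressive (AR(1)) description of the data and hypothetical sand moisture readings; the residuals, rather than the raw values, would then be plotted on a Shewhart-type chart:

```python
from statistics import mean

def ar1_residuals(x):
    """Residuals after removing first-order autocorrelation, as
    charted in Special Cause Control (SCC) schemes.  Returns the
    residual series and the estimated AR(1) coefficient phi."""
    m = mean(x)
    d = [v - m for v in x]
    # Lag-1 autocorrelation estimate of phi.
    phi = sum(a * b for a, b in zip(d, d[1:])) / sum(a * a for a in d)
    res = [d[t] - phi * d[t - 1] for t in range(1, len(x))]
    return res, phi

# Hypothetical slowly drifting moisture readings from a sand plant
# (auto-correlated, so a chart on the raw values would over-signal).
moisture = [3.0, 3.1, 3.3, 3.4, 3.5, 3.4, 3.2,
            3.1, 3.0, 2.9, 2.8, 2.9, 3.0, 3.2]
res, phi = ar1_residuals(moisture)
print(phi)
```

Because the drift is absorbed into the AR(1) term, control limits placed on `res` react to genuine special causes rather than to the predictable carry-over between consecutive readings, which is the effect the paper reports.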
The purpose of this paper was to test the suitability of time-series analysis for quality control of the continuous steel casting process in production conditions. The analysis was carried out on industrial data collected in one of the Polish steel plants. The production data concerned the defective fractions of billets obtained in the process. The procedure of industrial data preparation is presented. The computations for the time-series analysis were carried out in two ways, both using the authors’ own software. The first, applied to the real-valued data, has a wide range of capabilities, including not only prediction of future values but also detection of important periodicity in the data. In the second approach, the data were assumed in a binary (categorical) form, i.e. every heat (melt) was labelled as ‘Good’ or ‘Defective’. The naïve Bayesian classifier was used for predicting the successive values. The most interesting results of the analysis include the good prediction accuracies obtained with both methodologies, the crucial influence of the last preceding point on the predicted result in the real-valued time-series analysis, and the information obtained about the type of misclassification for the binary data. The possibility of predicting future values can be used by engineering or operational staff with expert knowledge to decrease the fraction of defective products by taking appropriate action when the forthcoming period is identified as critical.
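The binary (Good/Defective) prediction idea can be sketched compactly. With the previous label as the single attribute, a naïve Bayesian classifier reduces to a transition-count predictor; the label sequence below is invented for illustration, not the plant's data:

```python
from collections import Counter

def fit_transitions(labels):
    """Count (previous, next) label pairs; with one attribute (the
    preceding label) naive Bayes reduces to these transition counts."""
    return Counter(zip(labels, labels[1:]))

def predict_next(pairs, prev):
    """Most likely next label given the previous one."""
    cands = {nxt: c for (p, nxt), c in pairs.items() if p == prev}
    return max(cands, key=cands.get)

# Hypothetical heat labels: G = Good, D = Defective.
heats = ['G', 'G', 'D', 'G', 'G', 'G', 'D',
         'D', 'G', 'G', 'G', 'G', 'D', 'G']
pairs = fit_transitions(heats)
print(predict_next(pairs, 'G'))
```

A full naïve Bayes over several attributes would multiply such conditional frequencies, but even this one-attribute form shows how the "critical forthcoming period" warning mentioned above could be generated.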
The aim of the paper was an attempt at applying time-series analysis to the control of the grey cast iron melting process in production conditions. The production data were collected in one of the Polish foundries in the form of spectrometer printouts. The quality of the alloy was controlled through its chemical composition at roughly 0.5-hour time intervals. The procedure of preparing the industrial data is presented, including an OCR-based method of transformation to an electronic numerical format as well as the generation of records related to particular weekdays. The computations for the time-series analysis were made using the author’s own software, which has a wide range of capabilities, including detection of important periodicity in the data as well as regression modelling of the residual data, i.e. the values obtained after subtraction of the general trend, the trend of the variability amplitude and the periodical component. The most interesting results of the analysis include: a significant 2-measurement periodicity in the percentages of all components, a significant 7-day periodicity of the silicon content measured at the end of a day, and the relatively good prediction accuracy obtained without modelling of the residual data for various types of expected values. Some practical conclusions have been formulated, related to possible improvements in the melting process control procedures as well as more general tips concerning applications of time-series analysis in foundry production.
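Periodicity of the kind reported above (e.g. a 2-measurement cycle) is commonly detected via the sample autocorrelation at candidate lags. A minimal sketch, with an invented alternating silicon series rather than the foundry's data:

```python
from statistics import mean

def autocorr(x, lag):
    """Sample autocorrelation of series x at the given lag."""
    m = mean(x)
    d = [v - m for v in x]
    num = sum(d[t] * d[t - lag] for t in range(lag, len(x)))
    den = sum(v * v for v in d)
    return num / den

# Hypothetical Si content (%) with a clear 2-measurement alternation.
si = [1.9, 2.1, 1.9, 2.1, 1.9, 2.1, 1.9, 2.1, 1.9, 2.1]
print(autocorr(si, 2), autocorr(si, 1))
```

A strongly positive value at lag 2 combined with a negative value at lag 1 is the signature of a 2-measurement cycle; the 7-day effect would show up the same way at the weekly lag of a daily series.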
In renewable energy systems, faults may occur in the grid or in the power transmission line, and environmental conditions such as unbalanced wind speed may arise while the wind turbine is connected to the grid. In such cases, the control system should not be damaged and should keep the power transmission system stable. The study of the voltage stability of an independent wind turbine during a fault and after its clearance is one of the topics that can strengthen the future of stand-alone installations. During a fault, the network current increases dramatically, resulting in a higher voltage drop. Hence, fast voltage recovery during and after the fault, protection of the rotor- and grid-side converters against the fault current, and protection against the rising DC-link voltage (which increases sharply during a fault) are of great interest. Accordingly, several improvements have been made to the structure of a doubly-fed induction generator (DFIG) turbine, such as: a) a fault detection system, b) DC-link protection, c) a crowbar circuit, d) blocking of the rotor- and stator-side converters, e) injection of reactive power during the fault, f) nonlinear control design for the turbine blades, and g) tuning and harmonization of the controllers used to maintain power quality and to stabilize the system output voltage in the power grid. First, the dynamic models of the wind turbine, gearbox and DFIG are presented. Then the controllers are modelled. The simulation results have been validated in MATLAB/Simulink.
One-dimensional frequency analysis based on the DFT (Discrete Fourier Transform) is in many cases sufficient for detecting power disturbances and evaluating power quality (PQ). To illustrate the character of the signal in a more comprehensive manner, time-frequency analyses are performed. The most commonly known time-frequency representations (TFR) are the spectrogram (SPEC) and the Gabor Transform (GT); however, these methods have a relatively low time-frequency resolution. Other TFRs, the Discrete Dyadic Wavelet Transform (DDWT), the Smoothed Pseudo Wigner-Ville Distribution (SPWVD) and the new Gabor-Wigner Transform (GWT), are described in the paper. The main features of the transforms are presented on the basis of test signals.
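The DFT-based detection mentioned above can be shown on a synthetic PQ signal. A minimal sketch, assuming a 50 Hz fundamental with a 5th-harmonic (250 Hz) distortion sampled at 1 kHz; a spectrogram would simply repeat this bin computation over sliding windows:

```python
import cmath
from math import sin, pi

def dft_mag(x, k):
    """Normalized magnitude of the k-th DFT bin of signal x."""
    N = len(x)
    X = sum(x[n] * cmath.exp(-2j * pi * k * n / N) for n in range(N))
    return abs(X) / N

# Synthetic signal: 50 Hz fundamental (amplitude 1) plus a 5th
# harmonic at 250 Hz (amplitude 0.2), fs = 1 kHz, N = 200 samples,
# so bin k corresponds to k * fs / N = 5k Hz.
fs, N = 1000, 200
x = [sin(2 * pi * 50 * n / fs) + 0.2 * sin(2 * pi * 250 * n / fs)
     for n in range(N)]
print(dft_mag(x, 10), dft_mag(x, 50))  # bins for 50 Hz and 250 Hz
```

With an integer number of cycles in the window, the fundamental and the harmonic land exactly in bins 10 and 50 (magnitudes 0.5 and 0.1, half the sine amplitudes), which is why the one-dimensional DFT suffices for stationary disturbances; the TFRs in the paper address the non-stationary ones.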
The paper presents a practical example of improving quality and occupational safety on automated casting lines. Working conditions on a box moulding line with a horizontal mould split were analysed due to the low degree of automation at the stages of core and filter installation as well as spheroidizing mixture dosing. A simulation analysis was carried out concerning the rationale for introducing an automatic dispenser of the spheroidizing mixture to the mould. For this research, a simulation model of the line was created in the universal Arena software for modelling and simulation of manufacturing systems by Rockwell Software Inc. A simulation experiment was carried out on the model in order to determine the basic parameters of the working system. Organization and working conditions in other sections of the line were also analysed, paying particular attention to quality, ergonomics and occupational safety. An ergonomic analysis was carried out for the manual core installation and filter installation workplaces, and changes to these workplaces were suggested in order to eliminate actions that are unnecessary and onerous for employees.
The MDCT and IntMDCT algorithms are widely utilized in audio coding. The Integer MDCT (IntMDCT) is derived from the Modified Discrete Cosine Transform by means of a lifting scheme and rounding operations; it inherits the properties of the MDCT while providing exact invertibility and a good spectral representation. In this paper we discuss audio codecs such as AAC and FLAC using the MDCT and Integer MDCT algorithms, and determine which algorithm shows the better compression ratio (CR). The aim of this work is to combine lossy and lossless audio coding to obtain a reduced bit rate with finer sound quality. The quality of the audio is determined by subjective and objective testing in terms of MOS (Mean Opinion Score) and ABX tests, along with hearing-test methodologies such as PEAQ (Perceptual Evaluation of Audio Quality) and ODG (Objective Difference Grade). The performance measures, i.e. the Compression Ratio (CR) and the Sound Pressure Level (SPL), are estimated.
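The compression ratio used as the performance measure above is simply the size of the uncompressed stream divided by the size of the coded stream. A small sketch with illustrative numbers (16-bit stereo PCM at 44.1 kHz against a hypothetical 128 kbit/s coded stream; not figures from the paper):

```python
def compression_ratio(original_bytes, compressed_bytes):
    """CR = uncompressed size / compressed size (per unit time)."""
    return original_bytes / compressed_bytes

# One second of CD-quality PCM: 44100 samples * 2 channels * 2 bytes.
pcm_per_second = 44_100 * 2 * 2          # 176,400 bytes
coded_per_second = 128_000 // 8          # hypothetical 128 kbit/s -> 16,000 bytes
cr = compression_ratio(pcm_per_second, coded_per_second)
print(cr)
```

Lossy MDCT-based coding (as in AAC) typically reaches such double-digit ratios, while lossless IntMDCT/FLAC-style coding is limited to roughly 2:1; the paper's comparison quantifies this trade-off against perceived quality.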
The use of quantitative methods, including stochastic and exploratory techniques, in environmental studies does not seem to be sufficient in practical terms. There is no comprehensive analytical system dedicated to this issue, nor research on this subject. The aim of this study is to present the Eco Data Miner system: its concept, construction and the possibility of implementing it in existing environmental information systems. The methodological emphasis was placed on the issue of one-dimensional data quality assessment using the proposed QAAH1 method, which combines a harmonic model and robust estimators with the classical tests for outlier values and their iterative extensions. The results obtained demonstrate both the complementarity of the proposed solution to the classical methods and the fact that it significantly extends the range of applications. The practical usefulness is also highly significant due to the high effectiveness, numerical efficiency and simplicity of this new tool.
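The robust-estimator side of the data quality assessment described above can be illustrated with a median/MAD outlier screen. This is a generic sketch of the robust approach, not the QAAH1 method itself, and the readings are hypothetical:

```python
from statistics import median

def mad_outliers(x, k=3.5):
    """Flag values more than k robust sigmas from the median, using
    the median absolute deviation (MAD) scaled by 1.4826 so that it
    estimates the standard deviation for normal data."""
    med = median(x)
    mad = median(abs(v - med) for v in x)
    sigma = 1.4826 * mad
    return [v for v in x if abs(v - med) > k * sigma]

# Hypothetical nitrate readings (mg/L) with one gross error.
readings = [4.1, 4.3, 4.0, 4.2, 4.4, 4.1, 9.8, 4.2, 4.3, 4.0]
print(mad_outliers(readings))  # -> [9.8]
```

Unlike a mean/standard-deviation rule, the median and MAD are not inflated by the outlier itself, which is why robust estimators complement the classical outlier tests mentioned in the abstract.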