Improvement of Eyeglass Lens Production Using Simulation Modeling and Dynamic Buffer Optimization



Introduction

Significantly increased competition in the global economy requires ever-faster delivery of highly personalized products at competitive costs. Planning and managing the production of a wide range of products raises organizational issues for the company. The nature of demand, the type of customers, as well as the frequency of delivery translate into the character of production, and thus the type of material flow in the production process and the associated type of control. In the classic approach, two models are recognized: make-to-stock (MTS) and make-to-order (MTO) models.

MTO represents an approach in which a high degree of product variety or customization is possible, at the cost of longer production lead times (Akinc and Jack, 2015). MTO production is carried out only on customer order: only after the customer's order is accepted can a production order be generated and the product made.

The paper examines the problem of eyeglass lens manufacturing. Eyeglasses, especially eyeglass lenses, are highly personalized products. They are ordered by opticians in stores after the customer's eyesight test. The manufacturing strategy used is therefore called make-to-recipe, that is, it provides customized products to customers (Gyulai, et al., 2018). The main objective of MTO manufacturing is to treat each order received individually. In this environment, the flow of materials begins and ends at the stage of purchasing raw materials and semi-finished products necessary for manufacturing, and is triggered based on forecasts of their usage. The production process itself is initiated only after the order is received and its basic parameters, such as product features and delivery date, are identified. The basic feature of this environment is the strong influence of the customer on the final form of the ordered products.

Highly personalized products present an extensive set of options covering most of their essential features (Da Silveira, et al., 2001). In the case of lenses, customers place an order for a product, according to the available catalog offer, which is made of standard components. The process is characterized by high variability in production. Similar products are produced, but with different parameters. The basic parameters describing corrective lenses are the power of the lens, the power and axis of the cylinder, the lens index, and the type of coating. The main optical characteristics of a lens that are customer dependent are already created by the first complex abrasive processes (Braunecker, et al., 2008).

The process of manufacturing eyeglass lenses begins when the prescription is received. Once the order is received and the semi-finished product is released from the warehouse, the first stage is its mechanical processing. At this stage, the lens is formed to achieve the desired size and power. Only its back surface needs to be machined for the specific customer. The lens is ground and shaped, and then tens of thousands of machining points are cut using a natural diamond tool. This method allows the manufacturer to personalize the optical surface, giving it any form. This stage is followed by an intermediate inspection and optional tinting of the lens. At the tinting stage, the lenses are immersed in tinting baths.

Curing is the process of applying a varnish to the lens, which then needs to be cured in an appropriate oven depending on the type of coating. Once the lenses have been cured, the process of coating follows. This process takes place in a special machine and varies depending on the index of the lenses to be evaporated and the type of coating selected. After the coating process, the lens is ground and assembled. The lenses are automatically trimmed to the shape of the customer's frame. After grinding, the lenses are mounted to the frame. The last stage is the final check and dispatch of the order to the customer.

To gain a competitive advantage in the eyeglass lens industry, a manufacturer must produce quality lenses that meet customer expectations. In addition, a short manufacturing process becomes an added value of the services offered (Simbolon and Santoso, 2021). Beyond the limited possibility of mass production, the complexity of the problem is increased by the customer's expectation of a contract delivery service – a short time from the moment of ordering to the production and delivery of products to the optical store (Gyulai, et al., 2018). Customers tend to desire shorter manufacturing and delivery times, as eyeglasses are very important aids for people with visual impairment (Simbolon and Santoso, 2021).

In recent years, the eyewear industry has also been cited as one that has begun to follow the mass customization paradigm (Gilmore and Pine, 1997; Barman and Canizares, 2015). Mass customization aims to provide customers with exactly what they want at low cost and high quality in ever-changing environments (Akinc and Jack, 2015). According to the most commonly cited definition (Suzić, et al., 2018), mass customization is the development, production, marketing, and delivery of affordable goods and services in sufficient quantity and variety that almost everyone finds exactly what they want (Pine, 1993).

In a mass customization strategy, more complex and unpredictable situations arise at different stages of the production process due to greater interaction with consumers and external influences. Therefore, it becomes necessary to use different types of software and programs that are able to predict adverse effects in the production process (Behúnová, et al., 2017).

A way to study the behavior of complex systems is through quantitative modeling. Quantitative models often take the form of mathematical models. However, when mathematical models are used to study real-world systems, their greatly increased complexity makes it difficult to reach an analytical solution. In these situations, simulation modeling can be helpful. Simulation software translates mathematical equations into less abstract symbols (Malik, 2021).

Simulation modeling refers to the study of an existing or designed system by experimenting with a simulation system, creating a computer simulation model to simulate a dynamic real-world system, and performing various experiments on the simulation model to evaluate and improve system performance (Banks, et al., 2010). With the development of technology, various simulation software packages have been developed, for example, FlexSim, AnyLogic, Tecnomatix Plant Simulation, ExtendSim, and others. However, the emphasis and advantages of each simulation technology are different. FlexSim focuses on logistics industry simulation, AnyLogic prefers service industry logistics simulation, and Tecnomatix Plant Simulation focuses on discrete manufacturing systems (Qiao and Wang, 2021).

The eyeglass lens manufacturing technology and the machines and equipment available at some stages of the manufacturing process allow from several to a dozen or more individual lenses to be processed simultaneously in a single technological operation (e.g., oven curing). Prior to the start of such a processing operation, the lenses are collected in sufficiently large sets in buffers in front of the workstations. The capacity of the buffer usually depends on the capacity of the machine used at a given stage of the technological process. In the general case, it can be assumed that the filling of the buffer is a decision variable within the implemented process. This leads to the so-called buffer allocation problem.

The buffer allocation problem is an NP-hard combinatorial optimization problem in production line design. The problem is to find the optimal buffer sizes to allocate to buffer areas on a production line to achieve a specific goal. Properly sized buffers reduce machine idle time caused by starvation (no product to process) or blocking (no room to release the finished product). Shorter idle times increase average line throughput. Buffering, however, also increases work-in-progress inventory. If buffers are too large, the capital cost incurred may outweigh the benefit of increased productivity. If buffers are too small, machines will be underutilized and demand will not be met. Buffers are established for three basic production resources: material inventory, lead time, and job resources.
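
The blocking and starvation effects described above can be illustrated with a toy sketch (not the authors' model): a saturated two-machine line with one intermediate buffer and random processing times, using the standard completion-time recursion for blocking-after-service. All parameters here are illustrative assumptions.

```python
import random

def line_throughput(buffer_cap, n_parts=20000, seed=7):
    """Saturated two-machine line M1 -> buffer(buffer_cap) -> M2 with
    blocking-after-service; returns completed parts per unit time."""
    rng = random.Random(seed)
    f1 = [0.0] * (n_parts + 1)   # M1 finishes processing part i
    r1 = [0.0] * (n_parts + 1)   # M1 releases part i into the buffer
    st2 = [0.0] * (n_parts + 1)  # M2 starts processing part i
    f2 = [0.0] * (n_parts + 1)   # M2 finishes part i
    for i in range(1, n_parts + 1):
        s1 = rng.expovariate(1.0)            # random time at M1 (mean 1)
        s2 = rng.expovariate(1.0)            # random time at M2 (mean 1)
        f1[i] = r1[i - 1] + s1               # M1 starts after releasing i-1
        j = i - buffer_cap                   # part whose start frees a slot
        r1[i] = max(f1[i], st2[j]) if j >= 1 else f1[i]   # blocking
        st2[i] = max(r1[i], f2[i - 1])       # starvation
        f2[i] = st2[i] + s2
    return n_parts / f2[n_parts]
```

With deterministic processing times a buffer adds nothing; the throughput gain comes entirely from absorbing variability, which is why a simulation-based evaluation of buffer configurations is needed in the first place.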

In a production system, buffers can be associated directly with machines or sets of machines (Fig. 1). Because of the importance of finding good or optimal buffer configurations, proper problem formulation and solution can significantly improve the efficiency of the manufacturing process.

Figure 1

Production line with buffers (Source: Own elaboration)

The buffer allocation problem can be expressed mainly in three forms depending on the objective function. The functions can be concerned with maximizing the throughput of the production line, minimizing the total buffer size in the line, or minimizing the average inventory of work in progress (Demir, et al., 2014). For larger systems, approximation methods can be used. Simulation modeling (creating a model or digital twin of the system) is used to determine the throughput of the line, which allows for appropriate testing (Joubert and Kotze, 2020).

In recent years, many researchers have widely adopted metaheuristic methods to solve buffer allocation problems. One of the most popular metaheuristic approaches to this problem is the genetic algorithm. Genetic algorithms are heuristic search algorithms based on evolutionary theories of natural selection and genetics (Calvino, et al., 2007).
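
As a rough illustration (not the GA Wizard used later in the paper), a minimal genetic algorithm with tournament selection, one-point crossover, and mutation might look like this; `toy_fitness` is a hypothetical stand-in for a simulated throughput-minus-WIP-cost criterion:

```python
import random

def ga_maximize(fitness, bounds, pop_size=30, generations=40, seed=3):
    """Minimal generational GA: tournament selection, one-point
    crossover, occasional uniform reset mutation, with elitism."""
    rng = random.Random(seed)
    def rand_ind():
        return [rng.randint(lo, hi) for lo, hi in bounds]
    pop = [rand_ind() for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        nxt = [best[:]]                                  # elitism
        while len(nxt) < pop_size:
            p1 = max(rng.sample(pop, 3), key=fitness)    # tournament
            p2 = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, len(bounds))          # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.2:                       # mutation
                g = rng.randrange(len(bounds))
                lo, hi = bounds[g]
                child[g] = rng.randint(lo, hi)
            nxt.append(child)
        pop = nxt
        best = max(pop, key=fitness)
    return best

# Hypothetical fitness: diminishing throughput gain minus a WIP cost.
def toy_fitness(buffers):
    return sum(b / (b + 4) for b in buffers) - 0.02 * sum(buffers)
```

Elitism keeps the best individual across generations, so the objective value never deteriorates as the search proceeds.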

The paper proposes a method of improving the examined production process using simulation modeling and optimization experiments. The scope of the method is to determine the buffer capacity maximizing the number of manufactured products in the planning period. From the point of view of obtaining the best possible values of the defined quality criterion, the execution of the simulation model alone does not provide a solution to the problem. Therefore, a genetic algorithm was used to perform optimization experiments. The obtained results made it possible to increase the number of manufactured products in the planning period without significantly increasing the use of company resources. Thanks to the applied approach, the improved results were obtained without conducting time-consuming research and repeating the experiments many times. The paper presents a discussion of the results and conclusions.

Method

Simulation methods allow testing various solutions in the studied environment based on real data. Simulation studies include several phases. The applied approach for conducting simulation studies is given by the German standard VDI 3633. The general phase-oriented approach can be summarized as problem definition, system analysis, model formalization, implementation, experimentation, and analysis. Data collection and preparation are done in a parallel phase (Voit, et al., 2020).

The methodology adopted in the paper generally follows the mentioned guidelines. To develop a method for solving the problem, the procedure proposed in the literature was used as a starting point (Law, 2015). Several changes were made to this procedure. The verification and validation of the simulation model were combined into one step. Due to the interdependence of these activities, validation should always be performed for a specific set of data describing the problem and its representation in the model. Any inconsistencies should be detected and resolved in a single cycle of activities.

Furthermore, it was assumed that the optimization experiments (determining the best buffer capacity) and simulation experiments (checking the effectiveness of the solution in the simulation model) would be performed iteratively until a satisfactory improvement in reducing the process execution time was achieved. Under the original procedure, these were performed sequentially. The order of preparation and execution of the optimization experiments was also changed, and the analysis stage and the results utilization stage were combined. Thus, the following method was constructed:

1. Formulation of the problem and the study plan (Section 3.1)
2. Data collection (Section 3.2)
3. Construction of the simulation model (Section 3.3)
4. Verification and validation of the simulation model (Section 3.4)
5. Searching for a solution(s) (Section 3.5)
   a. Performing optimization experiments:
      - Determination of the optimization direction
      - Determination of the parameters of the optimization algorithm
      - Formulation and solution of the optimization task
      - Selection of solution(s) for simulation verification
   b. Conducting simulation experiments (running the simulation model using the solutions selected in step 5a)
   c. Checking whether the obtained solution can be considered satisfactory (if yes, proceed to step 6; if not, repeat steps 5a and 5b)
6. Discussion of the results (Section 3.6)
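
The iterative part of the method (steps 5a–5c) can be sketched as a generic loop; the `optimize`, `simulate`, and `acceptable` interfaces are hypothetical placeholders, not Tecnomatix APIs:

```python
def improve_process(optimize, simulate, acceptable, max_iters=10):
    """Alternate optimization and simulation experiments until a
    satisfactory solution is found (interfaces are placeholders)."""
    for _ in range(max_iters):
        candidates = optimize()                            # step 5a
        results = [(c, simulate(c)) for c in candidates]   # step 5b
        good = [c for c, r in results if acceptable(r)]    # step 5c
        if good:
            return good            # satisfactory solution(s) found
    return []                      # no acceptable solution within budget
```

In the paper's case a single pass through this loop sufficed, but the structure shows where repeated optimization rounds would plug in.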

Formulating the problem and planning the study begins with identifying the overall objective or objectives of the study and the specific issues to be addressed. Criteria for evaluating the achievement of the objective should also be identified at this stage.

Data collection is a work-intensive activity and takes up a large portion of the total time of conducting simulation studies. It is necessary to begin data collection before or in the early stages of model development. Actual historical data can also be used to build the model.

When building a simulation model, a simulation method that fits the previously formulated problem and an appropriate program or tool should be selected. The assumptions for building the model as well as determining the level of detail of the model, and the attributes of the objects should also be indicated at this stage. When building the model, it is essential that in addition to the modelers, people who are very familiar with the operation of the actual system are involved.

Model verification and validation is the process of verifying that the operational logic adopted is correct. The main purpose of model verification is to ensure that the conceptual model is accurately represented in the simulation model representation. Checking for errors and eliminating them is something that should be done throughout the modeling process. In the validation stage, a comparison of the simulation results with the actual data is done.

Simulation runs are used to test the output for changes in the input parameters. If the outputs match the data from an existing system (or a hypothetical system, at the design stage), then the model can be considered validated. If, on the other hand, the results obtained differ significantly from the results of the real system under study, one should go back to the data collection stage and check the data's correctness.

For the optimization problem, the decision variables, the constraints of the task, as well as the criteria for evaluating the obtained solutions must be specified. Decisions must be made regarding conditions such as the initial state of the simulation and the length of the simulation.

The next step is to perform optimization experiments and run the simulation. The optimization algorithm is used to solve the optimization problem.

The simulation model, in turn, allows verification of the solutions obtained by the genetic algorithm. This check determines whether a given solution satisfies the set assumptions in terms of the defined quality criteria. The process finishes after the assumed number of iterations of this procedure, or upon finding a solution(s) that meets the preset assumptions. This is checked during the analysis of the output data by determining which variant(s) of the experiments meet the set requirements (Fig. 2).

Figure 2

Iterative process of the optimization (Source: Own elaboration)

The final step is to present the results obtained. The results can be used for decision-making and future production planning. All decisions made as a result of the simulation study can be implemented in the real system. Documentation also plays an important role, that is, reports showing the chronological sequence of work undertaken and changes made. This process facilitates modification of the simulation in the future.

The above method requires the selection of a suitable simulation program, which must be equipped with tools for efficient input data entry, organization of experiments, and presentation of the obtained results, as well as an optimization algorithm. There are many simulation programs available on the market. The choice of the program for the study depends on the preferences and capabilities of the decision-maker. It should be noted that the execution of the research requires very good knowledge of the program.

Results
Formulation of the problem and the study plan

The subject of the research in the paper is the process of lens production in one of the manufacturing companies operating in the EU market. According to business assumptions, the process should take 48 h from customer order to delivery. During the process, delays are encountered in different departments of the company, which results in exceeding the assumed time.

In this situation, the company's management decided to start one shift on Saturdays to compensate for the delays incurred during the week. At the same time, a systematic solution was sought to eliminate delays in individual departments of the company.

The study covered the lens curing part of the production process, the point in the process where delays primarily occur. The lens curing process follows the operations of mechanical processing of the semi-finished product and intermediate inspection. Once the lenses are transferred to the curing department and cleaned, the varnish is applied and subsequently cured.

The coating process, which takes place in a five-chamber machine, consists of washing in NaOH solution, rinsing in water, drying, coating, and pre-curing the spectacle lens. After these steps, quality control takes place at a separate station. The final step is to bake the lens in an oven to cure the applied varnish. A diagram of the curing process is shown in Fig. 3.

Figure 3

The process of curing eyeglass lenses (Source: Own elaboration)

Data collection

The company works in two shifts from Monday to Friday. One shift works on Saturday. There are two breaks of 15 and 30 min per shift. Two workers work in one shift. The first worker performs cleaning operations, and the second worker performs quality control operations. There are two five-chamber paint application machines in the department. A different operation takes place in each chamber. In addition to the coating machines, there are three more furnaces, each designed for a different material index of the lens.

For several weeks, the lens curing process was observed at each station during the working day and the individual elements that followed were recorded. A prepared sheet was used to record the individual steps in chronological order, the types of steps, their duration, and the number of units of product processed and completed. Additional information was obtained from observations and interviews.

The lenses under study are technologically divided into three types. Concerning the so-called material index, these are lenses: 1.5, 1.6, and 1.67. The production of lenses is a custom-made process with approximate repeatability. Based on historical data, the percentage distribution of manufactured spectacle lenses with different material indices was determined for conducting experiments. The lens inflow distributions are shown on a sample of single-week data (Table 1).

Percentage distribution of manufactured products (Source: Own elaboration)

Product Portion (%)
1.5 40
1.6 55
1.67 5
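
For experimentation, an order stream matching the Table 1 percentages could be sampled as follows (a sketch only; the function name and sample size are illustrative):

```python
import random

def sample_orders(n, seed=42):
    """Draw n lens orders with the material-index mix of Table 1."""
    rng = random.Random(seed)
    indices = ["1.5", "1.6", "1.67"]
    weights = [40, 55, 5]          # percentages from Table 1
    return rng.choices(indices, weights=weights, k=n)

orders = sample_orders(10000)
share_16 = orders.count("1.6") / len(orders)   # close to 0.55
```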

Using a workday photograph (continuous observation of the working day), the times of the various operations were collected. Table 2 shows the names of the objects that represent the individual operations, the processing times, and the number of lenses processed in the machine.

Processing times and machine capacities (Source: Own elaboration)

Station Processing time (min) Capacity (pcs)
A010 1:00 1
A020 10:00 12
A021 15:00 12
A022 15:00 12
A023 15:00 12
A024 15:00 12
A030 15:00 12
A031 15:00 12
A032 15:00 12
A033 15:00 12
A034 20:00 12
A040 6:00 12
A050 70:00 120
A060 80:00 120
A070 90:00 120
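
A rough per-piece view of Table 2 divides each batch processing time by the machine capacity. This ignores the pipelining of the five-chamber machines, worker availability, and batching dynamics, so it only hints at the pacing stations:

```python
# Batch time (min) and capacity (pcs) per station, from Table 2.
stations = {
    "A010": (1, 1),   "A020": (10, 12), "A021": (15, 12), "A022": (15, 12),
    "A023": (15, 12), "A024": (15, 12), "A030": (15, 12), "A031": (15, 12),
    "A032": (15, 12), "A033": (15, 12), "A034": (20, 12), "A040": (6, 12),
    "A050": (70, 120), "A060": (80, 120), "A070": (90, 120),
}
per_piece = {name: t / cap for name, (t, cap) in stations.items()}
bottleneck = max(per_piece, key=per_piece.get)   # slowest per-piece station
```

On this naive measure the 20-minute pre-curing chamber (A034) paces the line, while the large-capacity ovens are comparatively fast per lens.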

For the buffers located in front of the machines, their maximum capacity was determined. The processing time for the products in the buffers was assumed to be zero.

The capacities of the buffers are shown in Table 3. Data on energy consumption and equipment failure rates were not included in the model.

Buffer capacity (Source: Own elaboration)

Buffer Capacity (pcs)
B01–B04 12
B05–B07 120

Construction of the simulation model

Building a simulation model requires defining a set of machines with certain parameters. It is necessary to take into account the number of operations, the duration of operations, break times, and working shifts. It was assumed that the simulation modeling software Tecnomatix Plant Simulation would be used to model the problem. Thanks to its built-in tools, it is possible to conduct experiments indicating which elements of the process should be improved.

The simulation model built in Tecnomatix consists of several objects and elements. The first of these is the Source object, which simulates the source of materials entering the system. The workstations are represented by Station objects. The A010 object represents the cleaning station. ParallelStation objects represent the execution of operations on a batch of material: objects A021 – A024 and A031 – A034 – the five-chamber machines, A040 – the intermediate control station, and A050 – A070 – the furnaces. In front of the Station and ParallelStation objects there are buffers – Buffer-type objects. These are places for collecting and grouping elements into production batches (objects B01, B02, B03, B04, B05, B06, B07). The WorkerPool, Workplace, and Broker objects are used to map employee activities. The ShiftCalendar object defines the division of working time into shifts and breaks. The EventController object is a system clock that determines the start and end of the simulation.

To fully reflect the real process, Methods were used to build the model. Methods are program code fragments written in SimTalk, an object-oriented language dedicated to Tecnomatix. A Method object allows programming control elements of the model that are activated by other objects during simulation execution. This makes it possible to modify the operation of a particular object to fit the modeling needs exactly, as well as to read object attributes and change their values. Products resulting from the execution of the entire process flow into the Drain object. The simulation model is presented in Fig. 4. The model consists of objects representing individual machines, buffers, workers, and transport routes. The objects are linked by connectors indicating the direction of material flow. The data processing logic was implemented in Methods written in the SimTalk programming language.

Figure 4

Simulation model of the curing process of spectacle lenses (Source: Own elaboration)

Verification and validation of the simulation model

The model was verified for possible errors. Simulations were performed for different input data – different percentage distributions of lens inflow. The percentage distribution given in Table 1, which recurs most frequently, was adopted.

The simulations run confirmed that the model prepared reflects the production process analyzed. The input data were processed in the same way and at the same time as in the real production system. The results obtained in terms of the number of finished products manufactured per week were in line with the actual situation. A comparison of the results obtained with historical data (slightly different proportions of lenses produced with different material indices) indicated that the data were similar. The model was considered to be correctly implemented.

As a baseline, production per 6-day working week is 3360 lenses.

Searching for a solution(s)

The experiments aimed to maximize the number of products produced during the planning period, so that the number of products produced in 5 days was at least 3360 units. The buffers in the system under analysis were trays of lenses loaded into successive machines. The decision variables were the values of filling the trays with lenses, represented by the capacities of the buffers in front of the individual machines. Due to the capabilities of the program, it was assumed that the filling of a given type of lens tray could vary from machine to machine. To solve the optimization task, the genetic algorithm available in Tecnomatix implemented as GA Wizard tool was used.

The objective function in the analyzed process is to maximize the number of finished products at the output of the process.

To enter the decision variables into the program, a table was created indicating the consecutive decision variables (Table 4). Each buffer from B01 to B07 corresponded to one integer decision variable. For each decision variable, a lower and an upper limit were defined, and it was enforced that the variables could take only even values. These assumptions were based on the characteristics of the production process. The capacity of the lens tray is 12 pieces, and this is the upper limit for the varnish application machines and the quality control station. For the three ovens, the upper limit is 120 pieces. The lenses are always processed in pairs, which determines both the step of the decision variables (even values only) and the lower limit of two pieces, because an empty tray is not loaded into the machine.

Constraints imposed on decision variables (Source: Own elaboration)

Buffer Lower bound Upper bound Step
B01–B04 2 12 2
B05–B07 2 120 2
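
The decision-variable encoding of Table 4 (even values within the bounds) can be sampled as follows; `random_config` is an illustrative helper, not part of the GA Wizard:

```python
import random

# Bounds from Table 4: B01-B04 up to 12 pcs, B05-B07 up to 120 pcs, step 2.
BOUNDS = [(2, 12)] * 4 + [(2, 120)] * 3

def random_config(rng):
    """One candidate buffer configuration: even values within bounds."""
    return [rng.randrange(lo, hi + 1, 2) for lo, hi in BOUNDS]

rng = random.Random(0)
cfg = random_config(rng)
```

Using a step of 2 in `randrange` guarantees that every generated capacity is even, so no repair step is needed after crossover or mutation as long as genes are always redrawn this way.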

In Tecnomatix Plant Simulation, the number of generations and the population size must be specified for the experiments to be carried out. The number of simulations the genetic algorithm performs depends on the number of generations, the generation (population) size, and the number of observations. Details can be found in the program documentation.

During the search for the best solution, the values of the decision variables – that is, the characteristics of the given objects – are changed, and after the experiments are completed, the characteristic values representing the best solution according to the adopted optimization criteria are stored in the model. The experiments were differentiated in terms of the number of generations and the generation size. In total, 25 experiments were run for the introduced variables.

The number of generations varied from 10 to 50; the generation size also varied from 10 to 50. In general, the larger the number of generations and the generation size, the better and more accurate the results the algorithm provides. The results are summarized in Table 5 and shown in Fig. 5.

Experiments’ results (Source: Own elaboration)

Experiment Number of generations Size of generation Objective Buffer capacity (decision variables) Running time (s)
B01 B02 B03 B04 B05 B06 B07
1 10 10 3856 2 6 4 8 86 46 48 62.3600
2 10 20 3872 4 2 2 6 46 76 18 132.3470
3 10 30 3858 4 2 4 4 32 52 10 192.9880
4 10 40 3878 2 4 2 8 42 52 96 252.0650
5 10 50 3870 2 4 2 8 52 56 26 330.5180
6 20 10 3868 2 6 4 8 36 76 48 126.0200
7 20 20 3884 4 2 2 6 46 76 24 270.6860
8 20 30 3882 4 2 2 4 30 52 10 395.2270
9 20 40 3878 2 4 2 8 42 52 96 525.5250
10 20 50 3880 2 4 2 8 52 56 16 656.1800
11 30 10 3868 2 6 4 8 36 76 48 199.4670
12 30 20 3884 4 2 2 6 46 76 24 409.8080
13 30 30 3884 4 2 2 4 30 52 24 604.6260
14 30 40 3882 4 4 6 6 38 52 96 781.3120
15 30 50 3884 2 4 2 8 52 52 16 937.6590
16 40 10 3878 2 4 4 8 38 76 48 276.7950
17 40 20 3884 4 2 2 6 46 76 24 549.0040
18 40 30 3884 4 2 2 4 30 52 24 794.0870
19 40 40 3882 4 4 6 6 38 52 96 1052.9440
20 40 50 3884 2 4 2 8 52 52 16 1287.4260
21 50 10 3878 2 4 4 8 38 76 48 347.1970
22 50 20 3884 4 2 2 6 46 76 24 693.6270
23 50 30 3884 4 2 2 4 30 52 24 980.5960
24 50 40 3884 4 4 6 6 40 52 96 1270.5940
25 50 50 3884 2 4 2 8 52 52 16 1538.1840

Figure 5

Experiments’ results (Source: Own elaboration)

The optimization experiments characterized by the best value of the objective function (3884) were selected for simulation. It was observed that certain buffer fill configurations were repeated.

The optimization experiments conducted for the same generation size had the same configuration of buffer filling capacity (Table 6).

Choice of configuration for the best result (Source: Own elaboration)

Experiment B01 B02 B03 B04 B05 B06 B07 Objective
7 4 2 2 6 46 76 24 3884
12 4 2 2 6 46 76 24 3884
13 4 2 2 4 30 52 24 3884
15 2 4 2 8 52 52 16 3884
17 4 2 2 6 46 76 24 3884
18 4 2 2 4 30 52 24 3884
20 2 4 2 8 52 52 16 3884
22 4 2 2 6 46 76 24 3884
23 4 2 2 4 30 52 24 3884
24 4 4 6 6 40 52 96 3884
25 2 4 2 8 52 52 16 3884

Given this fact, the number of buffer configurations can be reduced to four variants (Table 7).

Reduction of configuration for the best result (Source: Own elaboration)

Variant B01 B02 B03 B04 B05 B06 B07 Objective
1 4 2 2 6 46 76 24 3884
2 4 2 2 4 30 52 24 3884
3 2 4 2 8 52 52 16 3884
4 4 4 6 6 40 52 96 3884
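
The reduction from Table 6 to the four variants of Table 7 is a simple order-preserving deduplication, for example:

```python
# The 11 best configurations reported in Table 6, in experiment order.
configs = [
    (4, 2, 2, 6, 46, 76, 24),   # exp 7
    (4, 2, 2, 6, 46, 76, 24),   # exp 12
    (4, 2, 2, 4, 30, 52, 24),   # exp 13
    (2, 4, 2, 8, 52, 52, 16),   # exp 15
    (4, 2, 2, 6, 46, 76, 24),   # exp 17
    (4, 2, 2, 4, 30, 52, 24),   # exp 18
    (2, 4, 2, 8, 52, 52, 16),   # exp 20
    (4, 2, 2, 6, 46, 76, 24),   # exp 22
    (4, 2, 2, 4, 30, 52, 24),   # exp 23
    (4, 4, 6, 6, 40, 52, 96),   # exp 24
    (2, 4, 2, 8, 52, 52, 16),   # exp 25
]
variants = list(dict.fromkeys(configs))   # order-preserving deduplication
```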

For each variant, it was checked whether, using the given buffer settings, it would be possible to dispense with the Saturday working shift. The simulation duration was reduced from six to five working days and simulation experiments were carried out. The results are shown in Table 8.

Number of products manufactured during the assumed period (Source: Own elaboration)

Variant B01 B02 B03 B04 B05 B06 B07 Objective
1 4 2 2 6 46 76 24 3448
2 4 2 2 4 30 52 24 3502
3 2 4 2 8 52 52 16 3504
4 4 4 6 6 40 52 96 3420

Simulation experiments for each of the variants after configuration reduction confirmed the acceptability of the solution and the ability to increase the number of products produced in 5 days beyond the baseline – production in six working days of 3360 lenses. Thus, each of the solutions meets the objective set before the experiments. For the case studied, it was not necessary to look for other ways to improve the production process and redo the optimization experiments.

Discussion of the results

Optimization experiments were conducted for a 6-day planning period, and the best solutions were then checked with simulation experiments for a 5-day planning period. This reflected the assumption that the best solution obtained for the 6-day period would not necessarily allow the planning period to be shortened.

From the point of view of the production process, the best solutions obtained must be considered satisfactory. In the optimization experiments, for a 6-day planning period, the total production volume varies from 3856 to 3884 units, while maintaining the assumed proportions between lenses with different material indices. The best result differs from the worst by approximately 0.7%. Compared to the baseline, the worst of the best solutions was better by about 14.8%. After the simulation experiments for the 5-day planning period, the difference between the best and worst solutions was about 2.4%. Compared to the baseline, the worst of the best solutions was better by about 1.8%. This means that after shortening the planning period by one shift (by about 9.1%), the achieved improvement compared to the baseline decreased by about 13 percentage points (from 14.8% to 1.8%). Fig. 6 shows the percentage increase in production for the four best solutions.

Figure 6

Percentage increase of the number of manufactured products as optimization results (Source: Own elaboration)
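
The percentages quoted above follow directly from the production volumes (a quick check; the spreads are computed relative to the best result):

```python
baseline = 3360                     # lenses per 6-day week (baseline)
best_6, worst_6 = 3884, 3856        # best solutions, 6-day period
best_5, worst_5 = 3504, 3420        # same solutions simulated over 5 days

spread_6 = (best_6 - worst_6) / best_6 * 100      # ~0.7%
gain_6 = (worst_6 - baseline) / baseline * 100    # ~14.8%
spread_5 = (best_5 - worst_5) / best_5 * 100      # ~2.4%
gain_5 = (worst_5 - baseline) / baseline * 100    # ~1.8%
```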

The analysis of the results leads to the conclusion that in the general case, it may be expedient to carry out optimization experiments also for a 5-day planning period.

The time to obtain a solution ranges from 62 to 1538 seconds. In general, the execution time of the experiments grows with the number of generations and the generation size. The proposed method was used as a tool to improve the production process, and the experiments were conducted on historical data to check whether buffer fills other than the maximum would improve the production volume. Therefore, the execution time of the experiments can be considered acceptable.

Fig. 7 shows the best, average, and worst efficiency values of each generation for the first best solution obtained. With the planned number of 20 generations and a generation size also equal to 20, a significant improvement in the efficiency of the obtained solutions is visible up to the sixth generation.

Figure 7

Efficiency graph for number of generations 20 and generation size 20 (Source: Own elaboration)

From the 10th generation onward, the values improve only slightly. This means that the genetic algorithm implemented in Tecnomatix is efficient enough for the problem under investigation – a relatively small number of generations and generation size is sufficient.
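The convergence pattern described above (rapid early gains, then a plateau) can be illustrated with a generic elitist genetic algorithm. The sketch below is not the Tecnomatix implementation; its fitness function is a toy stand-in for a simulation run (with the best fill deliberately placed below maximum capacity, echoing the paper's finding), and all names are illustrative assumptions:

```python
import random

random.seed(4)

# Maximum capacities of the seven buffers B01-B07, as given in the text.
CAPS = [12, 12, 12, 12, 120, 120, 120]

def fitness(ind):
    # Toy surrogate for simulated throughput: each buffer's contribution
    # peaks at ~60% of its capacity (purely illustrative).
    return -sum((x - 0.6 * c) ** 2 / c for x, c in zip(ind, CAPS))

def random_ind():
    return [random.randint(1, c) for c in CAPS]

def evolve(pop_size=20, generations=20):
    """Elitist GA; returns (best, average, worst) fitness per generation."""
    pop = [random_ind() for _ in range(pop_size)]
    history = []
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        vals = [fitness(i) for i in scored]
        history.append((vals[0], sum(vals) / len(vals), vals[-1]))
        parents = scored[: pop_size // 2]          # truncation selection
        pop = parents[:]                           # elitism: keep parents
        while len(pop) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(CAPS))   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.3:              # random-reset mutation
                g = random.randrange(len(CAPS))
                child[g] = random.randint(1, CAPS[g])
            pop.append(child)
    return history

hist = evolve()
print(f"final best fitness: {hist[-1][0]:.2f}")
```

Because the top half of each generation is carried over unchanged, the best fitness is non-decreasing, and with 20 individuals over 20 generations most of the improvement typically occurs in the first few generations, matching the behavior reported for Fig. 7.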

In the different solutions, the fill of buffer B01 is 2 or 4 units, buffers B02–B04 vary from 2 to 8 units, B05 from 30 to 86 units, B06 from 46 to 76 units, and B07 from 10 to 76 units, against maximum capacities of 12 units (B01–B04) and 120 units (B05–B07), respectively. The distribution of buffer fills is not uniform across the different solutions. In the obtained solutions, the buffer fill levels are from over 60% to over 90% below the maximum values. This means that the machines must be started up much more frequently during one working shift, which will result in higher electricity consumption, more wear and tear on the machines, higher machine running costs, and a higher workload for the department staff. These elements should be investigated and their costs estimated before deciding to implement the improvement.

Overall, the solution obtained should be assessed as satisfactory and possible to implement in the company.

Discussion

The method proposed in the paper takes advantage of the latest simulation tools and the data processing capabilities offered by modern computers. It is expected that this trend will continue with the integration of even more data and better animation of models; the first such tools already offer virtual reality capabilities.

As shown in the paper, manufacturing processes are becoming more complex as products become more complex, among other things to meet the needs of mass customization. Simulation modeling is used to analyze such complex processes and to develop and test new concepts, systems, or resources that meet the expectations of modern manufacturing. This is the form taken by the problem-solving method proposed in the paper. The method requires further analysis in terms of establishing and extending the parameters of its application, including the planning period, the consumption of resources (energy and machinery), the parameterization of the optimization algorithm used, and the verification of the test procedure, which may alternatively be conducted in pairs: an optimization experiment followed by a simulation experiment.

Current research indicates that different approaches can be used to solve the buffer allocation problem. Given the success of metaheuristics such as genetic algorithms, ant colony optimization, simulated annealing, and tabu search in solving optimization problems, the integration of metaheuristics with simulation is becoming a trend. It is characteristic of metaheuristic approaches that they always reach feasible solutions but do not guarantee their optimality. However, as shown in the paper, satisfactory results in terms of the efficiency of the implemented processes can be obtained with a relatively small amount of computation.

It can be concluded that the vast majority of researchers have analyzed the buffer allocation problem to maximize process performance or minimize the total buffer size. This paper shows that the ultimate goal can be to improve process performance when searching for the best (optimal) buffer capacity. It should be noted that there is usually more than one objective when trying to optimize process performance. This approach requires the use of multi-criteria optimization, which seems to be an important research issue in this area and sets an additional direction for further work.

Conclusions

The problem posed in the paper can be considered investigated and solved adequately with respect to the set objectives and the indicated scope of research. A method for solving the problem has been proposed, the research method has been specified, the quality criterion for evaluating the results has been defined, and simulation experiments have been performed. The best solutions have been indicated, and the possibility of their practical use has been analyzed. The most important conclusion is that the simulation experiment method, taking into account the original approach to the buffer allocation problem, can effectively support the management and improvement of the efficiency of the studied class of production processes. For similar problems, the analysis of organizational and technological factors should be a good starting point for process optimization.

In the case of the analyzed process, the simulation results indicate great potential for improvement in this area of the enterprise's activity, the measure of which is the volume of lens production in the considered period. The use of simulation tools also significantly reduces the time needed to find a solution: using simulation, it is possible to observe in a few minutes the operation of a system that in reality works for many hours. Simulation modeling should be used when it is difficult or inadvisable to use analytical methods. The above conclusions are general and can be directly applied to modeling and solving manufacturing problems of similar characteristics and complexity.

Further research should compare the results obtained in this work with those that can be obtained using other ways of modeling the problem and other optimization algorithms. It should also include modeling and experimental verification of other elements of the production process, as well as analysis of the relationship between customized production and mass personalization of products (in the sense of creating highly personalized products for specific audiences based on a set of criteria). Combining these research directions should make it possible to solve the analyzed class of problems coherently and comprehensively.