[Workshop scheduling] Solving the hybrid flow shop scheduling optimization problem with a genetic algorithm in Matlab [with Matlab source code, issue 901]

Matlab scientific research, 2021-08-10 09:15:16


One. Brief introduction

1 Overview of the genetic algorithm
The genetic algorithm (Genetic Algorithm, GA) belongs to evolutionary computation. It is a computational model that simulates the Darwinian process of genetic selection and natural elimination in biological evolution, and it searches for the optimal solution by imitating that natural evolutionary process. The algorithm is simple and general, has strong robustness, and is well suited to parallel processing.

2 Characteristics and applications of the genetic algorithm
The genetic algorithm is a robust search algorithm that can be used for complex system optimization. Compared with traditional optimization algorithms, it has the following characteristics:
(1) It operates on an encoding of the decision variables. Traditional optimization algorithms usually compute directly with the actual values of the decision variables, whereas the genetic algorithm uses some encoded form of the decision variables as its object of operation. This encoding lets us borrow the concepts of chromosome and gene from biology, imitate the genetic and evolutionary mechanisms of organisms in nature, and apply the genetic operators conveniently.
(2) It uses fitness directly as the search information. Traditional optimization algorithms not only need the objective function value; the search process is often also constrained by continuity requirements, and may demand that "the derivative of the objective function exists" in order to determine the search direction. The genetic algorithm only uses the fitness value derived from the objective function value to determine the next search range, and needs no auxiliary information such as derivatives of the objective function. Working directly with the objective function value or the individual fitness also lets the search concentrate on the regions of the search space with higher fitness, which improves search efficiency.
(3) It searches from multiple points and has implicit parallelism. Traditional optimization algorithms usually iterate from a single initial point in the solution space. A single point provides little search information, so the search efficiency is low and the iteration may stop at a local optimum. The genetic algorithm starts the search for the optimal solution from an initial population made up of many individuals rather than from a single individual. Selection, crossover, mutation and other operations on the initial population produce a new generation that carries a large amount of information about the population. This information helps avoid searching unnecessary points and reduces the risk of getting trapped in local optima, so the search gradually approaches the global optimum.
(4) It uses probabilistic search instead of deterministic rules. Traditional optimization algorithms often use deterministic search methods: the transfer from one search point to the next has a fixed direction and transfer relation. This determinism may prevent the search from reaching the optimum and limits the range of problems the algorithm can be applied to. The genetic algorithm is an adaptive search technique: selection, crossover, mutation and the other operations are carried out in a probabilistic way, which increases the flexibility of the search process, while the algorithm can still converge to the optimal solution with high probability and has good global optimization ability. However, parameters such as the crossover probability and the mutation probability also affect the search results and search efficiency, so choosing the parameters of the genetic algorithm is an important issue in its application.
To sum up, because the overall search strategy and optimization method of the genetic algorithm do not depend on gradient information or other auxiliary knowledge, and only require the objective function that influences the search direction and the corresponding fitness function, the genetic algorithm provides a general framework for solving complex system problems. It does not depend on the specific domain of the problem and is robust with respect to the problem type, so it is widely used in many fields, including function optimization, combinatorial optimization, production scheduling, automatic control, robotics, image processing (image restoration, image edge feature extraction, ...), artificial life, genetic programming, and machine learning.

3 Basic flow and implementation techniques of the genetic algorithm
The basic genetic algorithm (Simple Genetic Algorithm, SGA) uses only three genetic operators: selection, crossover, and mutation. Its evolution process is simple, and it is the basis of other genetic algorithms.

3.1 The basic flow of the genetic algorithm
(1) Randomly generate an initial population of individuals whose encoding length is determined in advance (the length is related to the required accuracy of the solution);
(2) Evaluate each individual with the fitness function; individuals with high fitness are selected to take part in the genetic operations, and individuals with low fitness are eliminated;
(3) The individuals produced by the genetic operations (reproduction, crossover, mutation) form a new generation of the population, and the process repeats until the stopping criterion is satisfied (e.g. the generation count GEN reaches a preset limit);
(4) The best individual of the final generation is taken as the result of the genetic algorithm.
[Figure: flow chart of the basic genetic algorithm]
Here GEN is the current generation number, M is the population size, and i indexes the individuals of the population. A minimal code sketch of this loop is given below.
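As a minimal, runnable sketch of this loop, the four steps can be written in Matlab as shown below, using a toy problem (maximize the number of ones in a 20-bit binary string) rather than the scheduling problem itself; all parameter values are chosen only for illustration, and the individual operators are discussed one by one in section 3.2.

% Toy SGA: maximize the number of ones in a 20-bit string (illustration only)
M = 100;                          % maximum number of generations
N = 40;                           % population size (even)
Pc = 0.8; Pm = 0.01;              % assumed crossover and mutation probabilities
L = 20;                           % chromosome length
pop = rand(N,L) > 0.5;            % random binary initial population
for GEN = 1:M
    fit = sum(pop,2);                  % fitness = number of ones
    newpop = pop;                      % selection: binary tournament
    for i = 1:N
        a = randi(N); b = randi(N);
        if fit(a) >= fit(b)
            newpop(i,:) = pop(a,:);
        else
            newpop(i,:) = pop(b,:);
        end
    end
    for i = 1:2:N-1                    % single-point crossover of consecutive pairs
        if rand < Pc
            cp = randi(L-1);
            tmp = newpop(i,cp+1:end);
            newpop(i,cp+1:end) = newpop(i+1,cp+1:end);
            newpop(i+1,cp+1:end) = tmp;
        end
    end
    mask = rand(N,L) < Pm;             % bit-flip mutation
    newpop(mask) = ~newpop(mask);
    pop = newpop;
end
[bestFit,idx] = max(sum(pop,2));
best = pop(idx,:);                % best individual returned as the GA result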

3.2 Implementation techniques of the genetic algorithm
The basic genetic algorithm (SGA) is made up of the encoding, the fitness function, the genetic operators (selection, crossover, mutation), and the run parameters.
3.2.1 Encoding
(1) Binary encoding
The length of a binary string is related to the accuracy required of the solution, and the encoding must guarantee that every individual in the solution space can be encoded (a small decoding sketch is given after this list).
Advantages: encoding and decoding are simple, and the genetic and crossover operations are easy to implement.
Disadvantage: the string length can become large for high accuracy.
(2) Other encoding methods
Gray code, floating-point encoding, symbolic encoding, multi-parameter encoding, etc.
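As a rough illustration of the binary coding above (the variable range [a, b] and the string length L below are assumed only for the example), decoding an L-bit string into a real value can be done in Matlab as follows; the achievable precision is (b-a)/(2^L-1), which is why the string length is tied to the required accuracy.

% Sketch: decode an L-bit binary string into a real value x in [a, b]
a = 0; b = 10; L = 16;                 % assumed variable range and string length
bits = rand(1,L) > 0.5;                % an example chromosome
intval = sum(bits .* 2.^(L-1:-1:0));   % binary string -> integer in 0 .. 2^L-1
x = a + intval*(b-a)/(2^L-1);          % integer -> real value; precision (b-a)/(2^L-1)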
3.2.2 Fitness function
The fitness function should effectively reflect the gap between each chromosome and the chromosome of the optimal solution of the problem.
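For a minimization objective such as the makespan, a common way (an assumption added here for illustration, not taken from the original post) to turn the objective value into a fitness value is:

% Sketch: convert makespan values (to be minimized) into fitness values (to be maximized)
Z = [38 42 35 50];       % example makespans of four individuals
F = 1 ./ (1 + Z);        % smaller makespan -> larger fitness
% alternative: F = max(Z) - Z;  % linear rescaling; the worst individual gets fitness 0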
3.2.3 Selection operator
[Figure: commonly used selection operators]
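The original figure for this subsection is not available. As one common example of an SGA selection operator (added here for illustration; note that the source code in section Two instead uses pairwise tournament selection), roulette-wheel selection can be sketched as:

% Sketch of roulette-wheel (fitness-proportional) selection
F = [0.1 0.4 0.2 0.3];                     % example fitness values of four individuals
N = numel(F);
p = F / sum(F);                            % selection probability of each individual
c = cumsum(p);                             % cumulative probabilities
sel = zeros(1,N);
for k = 1:N
    sel(k) = find(rand <= c, 1, 'first');  % one spin of the wheel per slot in the new population
end
% sel lists the indices of the (possibly repeated) individuals copied to the next generation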
3.2.4 Crossover operator
Crossover refers to exchanging some genes between two paired chromosomes in a certain way, thereby producing two new individuals. Crossover is an important feature that distinguishes the genetic algorithm from other evolutionary algorithms and is the main way of producing new individuals. Before crossover, the individuals in the population must be paired, usually by random pairing.
Commonly used crossover operators (a sketch of single-point crossover follows this list):
single-point crossover
two-point crossover (and, more generally, multi-point crossover; the more crossover points there are, the more likely the individual's structure is destroyed, so multi-point crossover is generally not used)
uniform crossover
arithmetic crossover
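A minimal sketch of single-point crossover on two binary parent strings follows; the source code in section Two applies the same idea row-wise or column-wise to the m*n decision matrix.

% Sketch of single-point crossover on two binary parents
A = [1 0 1 1 0 0 1 0];            % example parent 1
B = [0 1 1 0 1 1 0 1];            % example parent 2
cp = randi(numel(A)-1);           % random crossover point in 1 .. L-1
a = [A(1:cp), B(cp+1:end)];       % offspring 1: head of A followed by tail of B
b = [B(1:cp), A(cp+1:end)];       % offspring 2: head of B followed by tail of A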
3.2.5 Mutation operator
Mutation in the genetic algorithm means replacing the gene values at some loci of an individual's chromosome string with other alleles of those loci, thereby forming a new individual.

In terms of the ability to generate new individuals, crossover is the main method of producing new individuals and determines the global search ability of the genetic algorithm, while mutation is only an auxiliary method; however, mutation is an indispensable step because it determines the local search ability of the algorithm. Together, the crossover and mutation operators carry out the global and local search of the search space, which is what gives the genetic algorithm its good search performance on optimization problems. A short sketch of bit-flip mutation is given below.
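A minimal sketch of bit-flip mutation for a binary chromosome (values chosen only for illustration):

% Sketch of bit-flip mutation applied gene by gene with probability Pm
Pm = 0.05;                         % example mutation probability
X  = [1 0 1 1 0 0 1 0];            % example chromosome
mask = rand(size(X)) < Pm;         % loci selected for mutation
X(mask) = 1 - X(mask);             % flip the selected bits (replace the allele)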

3.2.6 Run parameters
[Figure: run parameters of the basic genetic algorithm]
4 Basic principles of the genetic algorithm
4.1 The schema theorem
[Figure: statement of the schema theorem]
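Since the figure is missing, the usual statement of the schema theorem is reproduced here from standard references (not from the original post): if m(H,t) is the number of instances of schema H in generation t, f(H) the average fitness of H, f̄ the mean fitness of the population, δ(H) the defining length, o(H) the order, l the string length, and p_c, p_m the crossover and mutation probabilities, then

m(H, t+1) ≥ m(H, t) · [f(H)/f̄] · [1 − p_c · δ(H)/(l−1) − o(H) · p_m]

so short, low-order schemata with above-average fitness receive exponentially increasing numbers of trials in subsequent generations.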
4.2 The building block hypothesis
Schemata of low order and short defining length whose average fitness is above the population average are called gene blocks or building blocks.
Building block hypothesis: under the action of genetic operators such as selection, crossover and mutation, individual building blocks can be spliced together to form coding strings of ever higher fitness.
The building block hypothesis describes the basic idea behind solving problems with the genetic algorithm: better solutions can be produced by splicing building blocks together directly.

Two. Source code

function [Zp,Y1p,Y2p,Y3p,Xp,LC1,LC2]=JSPGA(M,N,Pm,T,P)
%----------------------------------------------------------------
% JSPGA.m  Genetic algorithm for (hybrid) flow shop scheduling
%----------------------------------------------------------------
% Input parameters
% M   number of generations (genetic iterations)
% N   population size (must be even)
% Pm  mutation probability
% T   m*n matrix; T(i,j) is the processing time of operation j of job i
% P   1*n vector; P(j) is the number of parallel machines available for operation j
%----------------------------------------------------------------
% Output parameters
% Zp  optimal makespan value
% Y1p start time of each operation of each job in the optimal schedule (used to draw the Gantt chart)
% Y2p end time of each operation of each job in the optimal schedule
% Y3p machine number used by each operation of each job in the optimal schedule
% Xp  value of the optimal decision variable; the decision variable is a real-coded m*n matrix
% LC1 convergence curve 1: best individual fitness of each generation
% LC2 convergence curve 2: average fitness of each generation
% Finally the program draws three figures: the two convergence curves and the Gantt chart of the schedule
%----------------------------------------------------------------
% Step 1: variable initialization
[m,n]=size(T); % m is the number of jobs, n is the number of operations
Xp=zeros(m,n); % optimal decision variable
LC1=zeros(1,M); % convergence curve 1
LC2=zeros(1,M); % convergence curve 2 (one entry per generation, hence length M)
%----------------------------------------------------------------
% Step 2: randomly generate the initial population farm
farm=cell(1,N); % a cell array is used to store the population
for k=1:N
X=zeros(m,n);
for j=1:n
for i=1:m
X(i,j)=1+(P(j)-eps)*rand; % real value in [1, 1+P(j)); flooring it later yields a machine index in 1..P(j)
end
end
farm{k}=X;
end
counter=0; % generation counter
while counter<M % stop when the maximum number of generations is reached
%----------------------------------------
% Step 3: crossover
newfarm=cell(1,N); % cell array that stores the offspring produced by crossover
Ser=randperm(N); % randperm(N) returns a random permutation of 1:N
for i=1:2:(N-1)
A=farm{Ser(i)}; % parent individual
B=farm{Ser(i+1)}; % the paired parent individual
Manner=unidrnd(2); % randomly choose the crossover direction (row-wise / column-wise); unidrnd(N) returns a random integer in 1:N
if Manner==1
cp=unidrnd(m-1); % randomly choose the crossover point
% single-point crossover of the two parents along the rows
a=[A(1:cp,:);B((cp+1):m,:)]; % offspring individuals
b=[B(1:cp,:);A((cp+1):m,:)];
else
cp=unidrnd(n-1); % randomly choose the crossover point
a=[A(:,1:cp),B(:,(cp+1):n)]; % single-point crossover of the two parents along the columns
b=[B(:,1:cp),A(:,(cp+1):n)];
end
newfarm{i}=a; % the offspring are stored in newfarm
newfarm{i+1}=b;
end
% merge the old and new populations
FARM=[farm,newfarm];
%----------------------------------------
% Step 4: selection (reproduction)
FITNESS=zeros(1,2*N);
fitness=zeros(1,N);
plotif=0;
for i=1:(2*N)
X=FARM{i};
Z=COST(X,T,P,plotif); % call the cost subfunction to evaluate the fitness (makespan)
FITNESS(i)=Z;
end
% selection uses pairwise random tournaments; elitism below preserves the best individual
Ser=randperm(2*N);
for i=1:N
f1=FITNESS(Ser(2*i-1));
f2=FITNESS(Ser(2*i));
if f1<=f2
farm{i}=FARM{Ser(2*i-1)};
fitness(i)=FITNESS(Ser(2*i-1));
else
farm{i}=FARM{Ser(2*i)};
fitness(i)=FITNESS(Ser(2*i));
end
end
% record the best individual and update the convergence curves
minfitness=min(fitness); % smallest makespan (best individual) in the population
meanfitness=mean(fitness); % average fitness of the population
LC1(counter+1)=minfitness; % convergence curve 1: best individual fitness of each generation
LC2(counter+1)=meanfitness; % convergence curve 2: average fitness of each generation
pos=find(fitness==minfitness); % positions of the best individuals in the population
Xp=farm{pos(1)}; % record the individual at the first such position as the current best
%----------------------------------------
% Step 5: mutation
for i=1:N
if Pm>rand % mutate individual i with probability Pm
X=farm{i};
I=unidrnd(m); % randomly pick a job (row)
J=unidrnd(n); % randomly pick an operation (column)
X(I,J)=1+(P(J)-eps)*rand; % re-draw the machine assignment of this gene
farm{i}=X;
end
end
farm{pos(1)}=Xp; % elitism: after mutation, restore the best individual
counter=counter+1;
end
%----------------------------------------------------------------
% Subfunction COST (called in Step 4 above): Z=COST(X,T,P,plotif)
% evaluates the makespan of the schedule encoded by X
function Z=COST(X,T,P,plotif)
% Step 1: variable initialization
[m,n]=size(X);
Y1p=zeros(m,n);
Y2p=zeros(m,n);
Y3p=zeros(m,n);
%----------------------------------------------------------------
% Step 2: compute the schedule of the first operation
Q1=zeros(m,1);
Q2=zeros(m,1);
R=X(:,1); % take out the first operation (column 1)
Q3=floor(R); % round down to get the machine used by each job in its first operation
% next, compute the start time and end time of the first operation of each job
for i=1:P(1) % loop over the machines of operation 1
pos=find(Q3==i); % indices of the jobs assigned to machine i (find returns positions in the vector)
lenpos=length(pos);
if lenpos>=1
Q1(pos(1))=0;
Q2(pos(1))=T(pos(1),1);
if lenpos>=2
for j=2:lenpos
Q1(pos(j))=Q2(pos(j-1));
Q2(pos(j))=Q2(pos(j-1))+T(pos(j),1);
end
end
end
end
Y1p(:,1)=Q1;
Y2p(:,1)=Q2;
Y3p(:,1)=Q3;
%----------------------------------------------------------------
% Step 3: compute the schedule of the remaining operations
for k=2:n
R=X(:,k); % take out operation k (column k)
Q3=floor(R); % round down to get the machine used by each job in operation k
% next, compute the start time and end time of operation k of each job
for i=1:P(k) % loop over the machines of operation k
pos=find(Q3==i); % indices of the jobs assigned to machine i
lenpos=length(pos);
if lenpos>=1
EndTime=Y2p(pos,k-1); % end times of the previous operation of these jobs
POS=zeros(1,lenpos); % will hold the jobs ordered by the completion time of their previous operation, earliest first
for jj=1:lenpos
MinEndTime=min(EndTime);
ppp=find(EndTime==MinEndTime);
POS(jj)=ppp(1);
EndTime(ppp(1))=Inf;
end
% using the completion times of the previous operation, compute the start and end time of operation k of each job
Q1(pos(POS(1)))=Y2p(pos(POS(1)),k-1); % the earliest-ready job starts as soon as its previous operation ends
Q2(pos(POS(1)))=Q1(pos(POS(1)))+T(pos(POS(1)),k); % end time = start time + processing time
if lenpos>=2
for j=2:lenpos
Q1(pos(POS(j)))=Y2p(pos(POS(j)),k-1); % tentative start time: end of the job's previous operation
Q2(pos(POS(j)))=Q1(pos(POS(j)))+T(pos(POS(j)),k); % end time = start time + processing time
if Q1(pos(POS(j)))<Q2(pos(POS(j-1))) % if the machine is still busy with the preceding job
Q1(pos(POS(j)))=Q2(pos(POS(j-1))); % wait until the machine becomes free
Q2(pos(POS(j)))=Q1(pos(POS(j)))+T(pos(POS(j)),k); % end time = start time + processing time
end
end
end
end
end
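For reference, a possible call of JSPGA on a small made-up instance could look like the snippet below; the data are purely illustrative, and the call assumes that the COST subfunction (whose listing is cut off above) is completed so that it returns the makespan Z of the schedule X.

% Illustrative call of JSPGA on a made-up instance (6 jobs, 3 operations)
T  = randi([2 10], 6, 3);   % processing times: T(i,j) for job i, operation j
P  = [2 3 2];               % number of parallel machines for each operation
M  = 50;                    % number of generations
N  = 40;                    % population size (even)
Pm = 0.05;                  % mutation probability
[Zp,Y1p,Y2p,Y3p,Xp,LC1,LC2] = JSPGA(M,N,Pm,T,P);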


Three. Running results

[Figures: the two convergence curves (best and average fitness per generation) and the Gantt chart of the resulting schedule]

Four. Remarks

Matlab version: 2014a

Copyright notice: this article was created by [Matlab scientific research]; please include a link to the original when reposting. https://car.inotgo.com/2021/08/20210810091259338k.html