Introducing the Fata Morgana Algorithm (FATA): a swarm intelligence algorithm designed to tackle complex continuous optimization problems of multiple types.
Inspired by the phenomenon of mirage formation, FATA implements two key strategies:
- Mirage Light Filtering Principle (MLF): Enhances exploration through innovative filtering techniques.
- Light Propagation Strategy (LPS): Accelerates convergence with effective propagation methods.
The MLF strategy, combined with the definite integration principle, enhances FATA's exploration capability. Concurrently, the LPS strategy applies trigonometric principles to each individual, improving the algorithm's convergence speed and exploitation capability.
These two sophisticated search strategies work together to ensure a balanced and effective optimization process.
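Concretely, in the pseudocode further below these strategies reduce to two quality factors. Writing $S$ for the definite integral of the sorted fitness values in the current iteration, $S_{\max}$ and $S_{\min}$ for its running maximum and minimum over the run, $f_i$ for the fitness of individual $i$, $f_{worst}$ for the worst fitness in the population, $f_{gbest}$ for the global best score, and $\epsilon$ for a small constant:

```latex
% Population quality factor (mirage light filtering, MLF):
IP = \frac{S - S_{\max}}{S_{\min} - S_{\max} + \epsilon}

% Individual quality factor (light propagation, LPS):
p_i = \frac{f_i - f_{worst}}{f_{gbest} - f_{worst} + \epsilon}
```

Each individual is reinitialized with probability $1 - IP$ and is otherwise refracted toward the global best or a random peer depending on $p_i$.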
FATA has been rigorously compared against a diverse range of competitive optimizers on 23 classical benchmark functions and the IEEE CEC 2014 benchmark suite to validate its optimization performance. The evaluation covers:
- Comprehensive qualitative analysis
- Exploration and exploitation competency evaluation
- Local optima avoidance strategies
- Extensive comparative experiments
The experimental results demonstrate the robustness and competitiveness of FATA in solving various multi-type functions. Additionally, FATA has been applied to three challenging practical engineering optimization problems, where it consistently outperforms its counterparts, showcasing its practical utility.
These results indicate that FATA has strong potential as a computer-aided tool for complex practical optimization tasks. Source code and related files are available at the FATA Project Page and other platforms.
Key features:
- Mirage Light Filtering Principle (MLF): Enhances exploration through innovative filtering techniques.
- Light Propagation Strategy (LPS): Accelerates convergence with effective propagation methods.
- Exploration and Exploitation Competence: Strikes a balance between exploration and exploitation (see the coefficient schedule sketched after this list).
- Local Optima Avoidance: Employs strategies to escape locally optimal solutions.
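In the pseudocode further below, this balance is driven by two time-varying coefficients computed from the fraction of the evaluation budget already consumed:

```latex
a = \tan\!\left(1 - \frac{FEs}{MaxFEs}\right), \qquad
b = \frac{1}{\tan\!\left(1 - \frac{FEs}{MaxFEs}\right)}
```

As $FEs$ approaches $MaxFEs$, $a$ decays from $\tan(1) \approx 1.56$ toward $0$ while its reciprocal $b$ grows, reshaping the perturbation magnitudes over the run.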
Visuals of the FATA optimization process and the mirage phenomenon that inspired it are available on the FATA Project Page.
To utilize the FATA optimization algorithm, follow these easy steps:
- Set the Objective Function: Define your objective function in the variable `fobj`.
- Specify Variable Bounds: Provide the lower (`lb`) and upper (`ub`) bounds of the variables.
- Configure Problem Parameters: Set the problem dimension (`dim`), population size (`N`), and maximum number of function evaluations (`MaxFEs`).
- Execute the Optimization Process: Run the FATA function to initiate the optimization sequence, as illustrated in the sketch below.
- Retrieve Results: The function returns the optimal solution in `bestPos`, the global best score in `gBestScore`, and the convergence curve in `cg_curve`.
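As a hedged illustration, here is a minimal Python sketch of this calling convention. The sphere objective and all parameter values are assumptions for demonstration, and `fata` is assumed to follow the pseudocode's signature and return order; a matching Python implementation sketch of `fata` follows the pseudocode below (the official release is distributed separately).

```python
import numpy as np

# Hypothetical objective for illustration: the sphere function.
def fobj(x):
    return float(np.sum(x ** 2))

dim = 30                      # problem dimension
lb = -100.0 * np.ones(dim)    # lower bounds
ub = 100.0 * np.ones(dim)     # upper bounds
N = 30                        # population size
MaxFEs = 30000                # maximum number of function evaluations

# Assumes the `fata` sketch given after the pseudocode below.
bestPos, gBestScore, cg_curve = fata(fobj, lb, ub, dim, N, MaxFEs)
print("best score:", gBestScore)
```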
Feel free to explore and leverage the FATA optimization algorithm for your optimization endeavors. Happy optimizing!

The algorithm in pseudocode:

```
FUNCTION FATA(fobj, lb, ub, dim, N, MaxFEs)
    // Initialize parameters
    worstInte ← 0                    // running maximum of the fitness integral
    bestInte ← ∞                     // running minimum of the fitness integral
    gBest ← array of zeros with length dim
    cg_curve ← empty array
    gBestScore ← ∞
    Flight ← initialization(N, dim, ub, lb)    // random initial solutions
    fitness ← array of infinities with size N
    FEs ← 0                          // function-evaluation counter
    it ← 1                           // iteration counter for the convergence curve
    // arf is the total-internal-reflection coefficient used below;
    // set it as in the original FATA implementation.

    // Main loop until the evaluation budget is exhausted
    WHILE FEs < MaxFEs DO
        FOR i FROM 1 TO N DO
            // Clamp each dimension of the solution to [lb, ub]
            Flight[i] ← clamp(Flight[i], lb, ub)
            FEs ← FEs + 1
            fitness[i] ← fobj(Flight[i])       // evaluate fitness
            // Greedy selection for the global best
            IF gBestScore > fitness[i] THEN
                gBestScore ← fitness[i]
                gBest ← Flight[i]
            ENDIF
        ENDFOR

        // Sort fitness to find the worst and best values
        Order, Index ← sort(fitness)
        worstFitness ← Order[N]
        bestFitness ← Order[1]

        // Apply the mirage light filtering principle (MLF)
        Integral ← cumulative integral of Order
        IF Integral[N] > worstInte THEN
            worstInte ← Integral[N]
        ENDIF
        IF Integral[N] < bestInte THEN
            bestInte ← Integral[N]
        ENDIF
        IP ← (Integral[N] - worstInte) / (bestInte - worstInte + epsilon)    // population quality factor

        // Time-varying coefficients based on the consumed budget
        a ← tan(1 - FEs / MaxFEs)
        b ← 1 / tan(1 - FEs / MaxFEs)

        // Update flight positions
        FOR i FROM 1 TO N DO
            Para1 ← a * random(dim) - a * random(dim)
            Para2 ← b * random(dim) - b * random(dim)
            p ← (fitness[i] - worstFitness) / (gBestScore - worstFitness + epsilon)    // individual quality factor
            IF random() > IP THEN
                Flight[i] ← random(dim) * (ub - lb) + lb    // reinitialize randomly
            ELSE
                FOR j FROM 1 TO dim DO
                    num ← floor(random() * N + 1)           // random individual index
                    IF random() < p THEN
                        Flight[i][j] ← gBest[j] + Flight[i][j] * Para1[j]        // light refraction (first phase)
                    ELSE
                        Flight[i][j] ← Flight[num][j] + Para2[j] * Flight[i][j]  // light refraction (second phase)
                        Flight[i][j] ← 0.5 * (arf + 1) * (lb[j] + ub[j]) - arf * Flight[i][j]    // total internal reflection
                    ENDIF
                ENDFOR
            ENDIF
        ENDFOR

        cg_curve[it] ← gBestScore
        it ← it + 1
        bestPos ← gBest
    ENDWHILE
    RETURN bestPos, gBestScore, cg_curve
END FUNCTION
```
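For convenience, below is a minimal Python sketch of this pseudocode; it is not the authors' official release. Two details are assumptions where the pseudocode is silent: the definite integral is discretized with the trapezoidal rule over the sorted fitness values, and `arf` is drawn uniformly at random per update. Consult the official source for the exact choices. This sketch defines the `fata` function used in the usage example above.

```python
import numpy as np

def fata(fobj, lb, ub, dim, N, MaxFEs, seed=None):
    """Minimal FATA sketch following the pseudocode above (not the official code)."""
    rng = np.random.default_rng(seed)
    lb = np.broadcast_to(np.asarray(lb, dtype=float), (dim,)).copy()
    ub = np.broadcast_to(np.asarray(ub, dtype=float), (dim,)).copy()

    worst_inte, best_inte = 0.0, np.inf              # running max/min of the fitness integral
    g_best, g_best_score = np.zeros(dim), np.inf
    flight = lb + rng.random((N, dim)) * (ub - lb)   # random initial solutions
    fitness = np.full(N, np.inf)
    cg_curve, fes = [], 0
    eps = np.finfo(float).eps

    while fes < MaxFEs:
        for i in range(N):
            flight[i] = np.clip(flight[i], lb, ub)   # keep solutions inside bounds
            fes += 1
            fitness[i] = fobj(flight[i])
            if fitness[i] < g_best_score:            # greedy global-best update
                g_best_score = fitness[i]
                g_best = flight[i].copy()

        order = np.sort(fitness)
        worst_fit = order[-1]

        # Mirage light filtering: trapezoidal integral of the sorted fitness curve
        integral = float(np.sum((order[1:] + order[:-1]) / 2.0))
        worst_inte = max(worst_inte, integral)
        best_inte = min(best_inte, integral)
        ip = (integral - worst_inte) / (best_inte - worst_inte + eps)  # population quality

        # Time-varying coefficients; eps guards division by zero at budget end (assumption)
        t = 1.0 - fes / MaxFEs
        a = np.tan(t)
        b = 1.0 / (np.tan(t) + eps)

        for i in range(N):
            para1 = a * rng.random(dim) - a * rng.random(dim)
            para2 = b * rng.random(dim) - b * rng.random(dim)
            p = (fitness[i] - worst_fit) / (g_best_score - worst_fit + eps)  # individual quality
            if rng.random() > ip:
                flight[i] = lb + rng.random(dim) * (ub - lb)   # reinitialize randomly
            else:
                for j in range(dim):
                    num = rng.integers(N)                      # random individual index
                    if rng.random() < p:
                        # Light refraction, first phase
                        flight[i, j] = g_best[j] + flight[i, j] * para1[j]
                    else:
                        # Light refraction, second phase
                        flight[i, j] = flight[num, j] + para2[j] * flight[i, j]
                        arf = rng.random()                     # assumed reflection coefficient
                        # Total internal reflection
                        flight[i, j] = 0.5 * (arf + 1.0) * (lb[j] + ub[j]) - arf * flight[i, j]

        cg_curve.append(g_best_score)

    return g_best, g_best_score, np.asarray(cg_curve)
```

Pairing this with the setup snippet shown earlier reproduces the full calling sequence.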
Authors:
- Ailiang Qi
- Dong Zhao
- Ali Asghar Heidari
- Lei Liu
- Yi Chen
- Huiling Chen
June 8, 2024
We welcome your inquiries! Feel free to reach out to us via email:
- Ailiang Qi: [email protected]
- Ali Asghar Heidari: [email protected], [email protected]
- Huiling Chen: [email protected]
If you use this code, please cite the primary FATA paper:
Title: FATA: An Efficient Optimization Method Based on Geophysics
Journal: Neurocomputing
Year: 2024
DOI: 10.1016/j.neucom.2024.128289
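For convenience, a BibTeX entry assembled from the details above (the entry key is an assumption, the author list is taken from the credits above, and the volume and page fields are omitted; verify against the published record):

```bibtex
@article{qi2024fata,
  title   = {FATA: An Efficient Optimization Method Based on Geophysics},
  author  = {Qi, Ailiang and Zhao, Dong and Heidari, Ali Asghar and Liu, Lei and Chen, Yi and Chen, Huiling},
  journal = {Neurocomputing},
  year    = {2024},
  doi     = {10.1016/j.neucom.2024.128289}
}
```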
For comparative analysis, consider exploring these notable recent optimization methods:
- (ECO) 2024: An advanced optimization framework that enhances performance across various problem classes. ECO Project Page
- (AO) 2024: A novel approach to optimization challenges, boasting improved convergence rates and robustness. AO Project Page
- (PO) 2024: A powerful optimization methodology focused on solving complex multi-modal functions efficiently. PO Project Page
- (RIME) 2023: A cutting-edge algorithm for robust optimization, designed to handle noisy and dynamic environments. RIME Project Page
- (INFO) 2022: An innovative optimization technique that integrates information theory concepts for enhanced search capabilities. INFO Project Page
- (RUN) 2021: A dynamic optimization solution that adapts to changing landscapes and varying problem dimensions. RUN Project Page
- (HGS) 2021: A high-performance optimization strategy that excels in exploring large and complex search spaces. HGS Project Page
- (SMA) 2020: A sophisticated optimization method that mimics collaborative behaviors in nature for problem-solving. SMA Project Page
- (HHO) 2019: A pioneering optimization algorithm inspired by the hunting behavior of Harris hawks. HHO Project Page