Flavours of Group Search Optimizer: A Survey
Joshil Raj
Ph.D Scholar, Dept. of CSE, Pondicherry University
[email protected]

S. Siva Sathya
Associate Professor, Dept. of CSE, Pondicherry University
[email protected]

Sebabrata Ghosh
M.Tech Student, Dept. of CSE, Pondicherry University
[email protected]
Abstract
Group Search Optimizer (GSO) is a novel bio-inspired optimization algorithm imitating the foraging behaviour of animals. The Producer-Scrounger (PS) model is used to achieve effective searching strategies during the execution of the algorithm, making GSO a simple and powerful optimization technique; consequently, several researchers have in the recent past come up with different versions of GSO for varied applications. This paper provides a broad overview of GSO and then delves into an exhaustive survey of the different variations and modifications of GSO proposed in published research work. The paper also covers the different application domains where GSO and its variations/modifications have been successfully used so far.

Keywords: Group search optimization, PSO, Multi-Producer, Cooperative learning, Multi-objective, Artificial Fish Swarm.
1. Introduction
Bio-inspired population-based methods are becoming increasingly popular for solving a large number of optimization problems. The most commonly used ones are the Genetic Algorithm (GA), Differential Evolution (DE), Artificial Neural Networks (ANN), Ant Colony Optimization (ACO), Bee Colony Optimization (BCO) and Particle Swarm Optimization (PSO). Group Search Optimizer (GSO) is a novel bio-inspired heuristic algorithm imitating the behaviour of a population of animals foraging for food resources, and it has been widely used for solving several optimization problems [1]. In the animal kingdom, an individual animal benefits from the sharing of meaningful information with the other members of its group. GSO uses the Producer-Scrounger (PS) model for the search for food by the members of the population [1][2]. Since the successful implementation of the GSO algorithm, several variations and modifications of it have been brought forward by researchers, and this paper surveys the use of GSO and its variations/modifications in solving several real-life problems in different application domains.
The paper is organized as follows: Section 2 provides an overview of GSO. The different variations/modifications of GSO are described in Section 3, and the use of GSO in different application domains is explored in Section 4. The conclusion is drawn in Section 5.
2. Group Search Optimizer

All the members of the animal population are organized in the form of groups in the GSO algorithm, and each individual of the group is referred to as a member of the group. Three different kinds of members form the group, namely:
1. Producers are members that perform the producing strategy in the search for food. In every iteration, the producer is the member with the best fitness value and is thus located in the most promising area;
2. Scroungers are members that perform the scrounging strategy, i.e. joining the food resources discovered by other members;
3. Rangers are members that perform random walks in search of resources, i.e. they search for randomly distributed food resources in the search space.
The producer scans randomly in its field of vision [3] and will either move to a location with a better resource or stay in its current location and turn its head angle. All scroungers move towards the producer to find a location with better food resources. If the food resource found by a scrounger is better than the one held by the current producer, then that scrounger becomes the producer for
the next iteration, thereby maintaining the requirement of having the best member as the producer. The rangers move a random distance with a random head angle. Whenever a GSO member moves outside the search space, it is turned back to its previous position inside the bounded search space.
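The interaction of producer, scroungers and rangers can be illustrated with a short sketch. The Python fragment below is a minimal, simplified illustration of one GSO iteration and not the original formulation in [1]: the head-angle scanning and polar-to-Cartesian transform are replaced by a plain random probe around the producer, and the step size, the 20% ranger fraction and the clipping-to-bounds rule are illustrative assumptions.

```python
import numpy as np

def sphere(x):
    """Example fitness function (minimised); stands in for any objective."""
    return float(np.sum(x ** 2))

def gso_step(positions, fitness, fit_fn, bounds, rng, ranger_frac=0.2, step=0.3):
    """One simplified GSO iteration: the producer scans, scroungers join, rangers walk."""
    n, dim = positions.shape
    lo, hi = bounds
    producer = int(np.argmin(fitness))                 # best member acts as producer

    # Producer: probe a few nearby points and keep the best one found.
    for _ in range(3):
        cand = np.clip(positions[producer] + step * rng.normal(size=dim), lo, hi)
        f = fit_fn(cand)
        if f < fitness[producer]:
            positions[producer], fitness[producer] = cand, f

    # Remaining members: most scrounge towards the producer, a fraction act as rangers.
    for i in range(n):
        if i == producer:
            continue
        if rng.random() > ranger_frac:                 # scrounger: move towards the producer
            positions[i] += rng.random(dim) * (positions[producer] - positions[i])
        else:                                          # ranger: random walk
            positions[i] += step * rng.normal(size=dim)
        # Simplification: clip to the bounds (the original GSO returns an escaping
        # member to its previous position instead).
        positions[i] = np.clip(positions[i], lo, hi)
        fitness[i] = fit_fn(positions[i])
    return positions, fitness

# Usage: 48 members, 30 dimensions, 100 iterations on the sphere function.
rng = np.random.default_rng(0)
pos = rng.uniform(-5.0, 5.0, size=(48, 30))
fit = np.array([sphere(p) for p in pos])
for _ in range(100):
    pos, fit = gso_step(pos, fit, sphere, (-5.0, 5.0), rng)
print(fit.min())
```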
3. Different Types of GSO
Some modifications and variations of the basic GSO algorithm have been proposed by different researchers to take advantage of the search capability of GSO and to make it amenable to a particular task. The modifications/variations proposed over the GSO algorithm can be broadly classified into Hybrid GSO algorithms and Modified GSO algorithms. Hybrid GSO algorithms are combinations of the GSO algorithm with other algorithms, for example FSGSO and GSPSO. FSGSO is the combination of the GSO algorithm and the Fish Swarm Algorithm, and likewise GSPSO is the combination of the GSO and PSO algorithms. In GSPSO, the PSO algorithm is used for global optimization and the GSO algorithm is employed for optimization in the local search space; switching between the two algorithms is also done. Modified GSO algorithms are simple modifications of the GSO algorithm. Some of the modified GSO algorithms reported in research publications are MPGSO, NMGSO, CGSO and QGSO. Traditional GSO has a single producer, whereas MPGSO uses multiple producers. In NMGSO, the scanning strategy is performed by a limited pattern search, thereby increasing the computational efficiency and improving the performance. CGSO uses cooperative behaviour between different GSO groups to improve the performance. QGSO increases the number of rangers in the algorithm and uses the search strategy of the PSO algorithm. The hierarchy of the different types of GSO is presented in Figure 1. This section gives a brief description of each type.
Figure 1: Different types of GSO algorithms (GSO is divided into Hybrid GSO: FSGSO, GSPSO; and Modified GSO: MPGSO, NMGSO, CGSO, QGSO)
3.1 FSGSO
Oliveira, Pacifico and Ludermir [4] proposed the Fish Swarm Group Search Optimizer (FSGSO), a hybrid Group Search Optimization based on the behaviors of fish swarms. The hybridization was done by changing the scrounging procedure: the group members nominated as scroungers move in the search space following the fish-swarm behaviors Swarm, Follow, Prey and Leap from the Artificial Fish Swarm Algorithm (AFSA) [5]. In FSGSO, the producer maintains search procedures similar to the original GSO; however, the search strategy of the scroungers follows the search procedures of a fish swarm. In the traditional GSO scrounging strategy, the scroungers move towards the producer randomly in each iteration, whereas in FSGSO the scroungers move around the search space taking into consideration the location of the producer and the center of the group, and also perform random walks. In this method, scrounging is performed through the execution of two behaviors from the AFSA: Swarm and Follow. Both behaviors are executed, and the one that achieves the better fitness is used to update the position of the scrounger. The Follow behavior represents the ability to follow the producer, given that the current state conditions are accepted. In the Follow behavior, the selected scrounger will follow the producer if the region is not crowded (n_f/n < σ) and the producer's position has better fitness than the present location (Y_p > Y_i); the position is then updated using eq. (1). If the condition is not satisfied, eq. (2) is used to update the position of the scrounger.

X_i^{k+1} = X_i^k + rand() * r1 * step * (X_p - X_i^k) / ||X_p - X_i^k||        (1)

X_i^{k+1} = Prey(X_i^k)        (2)
The Swarm behavior gathers information from all the members of the group in order to determine a central position, which influences the movement of the scrounger. In the Swarm behavior, the update is performed if the region is not crowded (n_f/n < σ) and the central position is in a region with a better food concentration than the current position (Y_c > Y_i); the position is then updated using eq. (3). If the condition is not satisfied, the scrounger uses the Prey procedure as per eq. (4) to update its position.

X_i^{k+1} = X_i^k + rand() * r1 * step * (X_c - X_i^k) / ||X_c - X_i^k||        (3)

X_i^{k+1} = Prey(X_i^k)        (4)
In the Prey behaviour, the scrounger tries to find a better position inside its visual field. If after a number of tries (trynumber) the scrounger succeeds, the update is performed as shown in eq. (5). If the scrounger does not find a neighbor with better fitness after that number of tries, it moves randomly as shown in eq. (6).

X_i^{k+1} = X_i^k + rand() * r1 * step * (X_j - X_i^k) / ||X_j - X_i^k||        (5)

X_i^{k+1} = Leap(X_i^k)        (6)

The Leap behaviour is characterized by a random movement of the scrounger, as shown in eq. (7).

X_i^{k+1} = X_i^k + rand() * r1 * step        (7)
All the other movements of producers and rangers are
the same as used in the traditional GSO.
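The interplay of these behaviours can be summarised in a short sketch. The Python fragment below is only an approximation of the FSGSO scrounger update described by eqs. (1)-(7): the parameter values (sigma, step, visual, trynumber, the crowding radius), the naive crowding count n_f and the minimisation convention are assumptions for illustration, not the authors' exact settings.

```python
import numpy as np

def leap(x, rng, step=0.3):
    """Leap: a purely random movement (eq. 7)."""
    return x + rng.random() * step * rng.uniform(-1, 1, size=x.shape)

def prey(x, fit_fn, rng, visual=0.5, step=0.3, trynumber=5):
    """Prey: probe up to `trynumber` random neighbours inside the visual field;
    move towards the first better one (eq. 5), otherwise leap (eqs. 6-7)."""
    fx = fit_fn(x)
    for _ in range(trynumber):
        xj = x + visual * rng.uniform(-1, 1, size=x.shape)
        if fit_fn(xj) < fx:                              # minimisation convention
            d = xj - x
            return x + rng.random() * step * d / (np.linalg.norm(d) + 1e-12)
    return leap(x, rng, step)

def fsgso_scrounge(x, x_producer, group, fit_fn, rng, sigma=0.75, step=0.3, radius=0.5):
    """Scrounger update: run Follow (towards the producer) and Swarm (towards the
    group centre), then keep whichever candidate position is fitter."""
    n = len(group)
    center = np.mean(group, axis=0)

    def move_toward(target):                             # eqs. 1/3 with the crowding test
        nf = sum(np.linalg.norm(m - target) < radius for m in group)
        if nf / n < sigma and fit_fn(target) < fit_fn(x):
            d = target - x
            return x + rng.random() * step * d / (np.linalg.norm(d) + 1e-12)
        return prey(x, fit_fn, rng, step=step)           # eqs. 2/4

    follow_pos = move_toward(x_producer)
    swarm_pos = move_toward(center)
    return follow_pos if fit_fn(follow_pos) < fit_fn(swarm_pos) else swarm_pos
```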
FSGSO was evaluated on eight benchmark functions and compared with the DE, PSO and standard GSO algorithms, achieving the best overall performance among them.
3.2 GSPSO
Yan and Shi [6] introduced a novel hybrid algorithm called Group Search Particle Swarm Optimization (GSPSO) that uses both the Particle Swarm Optimization (PSO) and Group Search Optimization (GSO) algorithms. The PSO algorithm finds a good local search space for the problem, and the GSO algorithm then uses the convergence of scroungers and rangers to update this local search space. PSO is applied again to find a smaller and better local search space iteratively, so the algorithm keeps switching between GSO and PSO. The best position is found through this process of mutual correction by PSO and GSO. The GSPSO algorithm is presented in Figure 2.
1. Randomly initialize the particles with positions, velocities and head angles.
2. The particle with the best fitness value is nominated as the producer.
3. Check the value of GP to determine the use of PSO or GSO.
4. Remove members with poor fitness values.
5. Calculate the switch rates of PSO and GSO.
6. The pbest position of each member is taken as its initial position, and gbest is taken as the position of the producer.
7. Remove members with poor fitness values.
8. Initialize the velocities of each member randomly.
9. Update the pbest of each member.
10. Check for terminal conditions.
11. End.
Figure 2: Pseudo code for the GSPSO algorithm
GSPSO was run on four benchmark functions with 30- and 300-dimensional datasets and compared with the GSO, PSO, QGSOPC and CBPSO algorithms.
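One way to read the GP-based switching of Figure 2 is as an alternation between a PSO-style phase (exploration) and a GSO-style phase (producer/scrounger/ranger refinement). The sketch below illustrates only that switching idea; the switch probability gp, the PSO coefficients, the 80/20 scrounger/ranger split and the omission of the member-removal steps are assumptions, not the published scheme.

```python
import numpy as np

def gspso(fit_fn, dim, n=30, iters=200, gp=0.5, bounds=(-5.0, 5.0), seed=0):
    """Illustrative GSPSO loop: each iteration runs either a PSO-style or a
    GSO-style update, chosen with probability `gp` (an assumed switch rate)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n, dim))
    v = np.zeros((n, dim))
    pbest, pfit = x.copy(), np.array([fit_fn(p) for p in x])
    for _ in range(iters):
        g = int(np.argmin(pfit))                           # gbest, also the producer
        if rng.random() < gp:                              # PSO-style phase
            r1, r2 = rng.random((n, dim)), rng.random((n, dim))
            v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (pbest[g] - x)
            x = np.clip(x + v, lo, hi)
        else:                                              # GSO-style phase
            for i in range(n):
                if i == g:
                    continue
                if rng.random() < 0.8:                     # scrounge toward the producer
                    x[i] += rng.random(dim) * (pbest[g] - x[i])
                else:                                      # range: random walk
                    x[i] += 0.3 * rng.normal(size=dim)
            x = np.clip(x, lo, hi)
        fit = np.array([fit_fn(p) for p in x])
        better = fit < pfit                                # update personal bests
        pbest[better], pfit[better] = x[better], fit[better]
    return pbest[int(np.argmin(pfit))], float(pfit.min())
```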
3.3 MPGSO
A. B. M. Junaed, M. A. H. Akhand, Al-Mahmud and K. Murase [7] proposed a multi-producer GSO (MPGSO) based on multiple producers. In MPGSO, the top x% of members are first selected as producers and a roulette wheel is built from these producers. For each scrounger, a producer is selected from the roulette wheel, and the scrounging strategy is performed towards that producer. The MPGSO algorithm produced optimal solutions in fewer iterations than the original GSO algorithm when solving benchmark functions. The MPGSO algorithm is presented in Figure 3.
1. Randomly initialize the members of the population and generate their fitness values.
2. while terminal conditions are not met do
3. for each member i in the group do
   a. Find the top x% of members as producers of the group.
   b. Create a roulette wheel for selecting a producer for each scrounger, and perform scrounging towards the selected producer.
   c. The rest of the members (rangers) will perform ranging.
   d. Calculate the fitness value of each member again.
4. end for
5. end while
Figure 3: Pseudo code for the MPGSO algorithm
MPGSO was tested on eight benchmark functions with 2- and 30-dimensional datasets and compared with the traditional GSO algorithm; it achieved the best results for each of the problems.
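The producer selection described above can be sketched briefly. The fragment below picks the top x% of members as producers and builds a roulette wheel over them; the inverse-fitness weighting of the wheel is an assumption (the selection weights are not specified here), and minimisation is assumed.

```python
import numpy as np

def assign_producers(fitness, x_percent=20, rng=None):
    """Pick the top x% of members as producers (lowest fitness under minimisation),
    build a roulette wheel over them, and draw one producer for each scrounger."""
    rng = rng or np.random.default_rng()
    fitness = np.asarray(fitness, dtype=float)
    n = len(fitness)
    k = max(1, n * x_percent // 100)
    producers = np.argsort(fitness)[:k]                 # top x% of members
    weights = 1.0 / (1e-12 + fitness[producers])        # assumed inverse-fitness weights
    probs = weights / weights.sum()                     # roulette-wheel probabilities
    scroungers = [i for i in range(n) if i not in set(producers.tolist())]
    choice = {i: int(rng.choice(producers, p=probs)) for i in scroungers}
    return producers, choice                            # each scrounger -> its producer

# Usage: 10 members; scroungers scrounge towards their drawn producer.
prods, mapping = assign_producers(np.random.default_rng(1).random(10), x_percent=20)
print(prods, mapping)
```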
3.4 NMGSO
Wang, Zhong and Liu [8] proposed a novel modification of GSO for multi-objective optimization problems, called NMGSO. Unlike the original GSO, where each member has to keep a record of its head angle and perform a polar-to-Cartesian coordinate transformation while updating its position, in NMGSO the scanning strategy of each member is replaced with a limited pattern search (LPS) [8][9], which is more efficient for high-dimensional optimization problems. The limited pattern search procedure simplifies the search behavior and also enhances the capability of local search. In traditional GSO, rangers search for new positions based on a random head angle and distance, but in NMGSO the global search behavior of the rangers is modified
with a controlling probability thereby introducing
randomness and enhancing the diversity of members.
In multi-objective optimization problems, it is not easy to choose the best member as the producer, since there is no dominance relationship between the various objective functions, and multiple producers may be required to guide the groups of members. Hence the number of producers is set according to the number of objectives in the multi-objective function. For every objective, the group member with the single-objective extreme is selected as a producer, and these producers optimize their respective objectives with the limited pattern search. NMGSO was tested on seven benchmark functions with two objectives and compared with the MOGSO and MOPSO algorithms. NMGSO achieved the best results for each of these problems, and its convergence property, along with its ability to find a diverse set of solutions, was the best among the compared algorithms. NMGSO is thus more suitable than GSO for solving multi-objective optimization problems.
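The limited pattern search mentioned above can be pictured, roughly, as a coordinate-wise probe around the current position. The sketch below is only one plausible reading of such a pattern-search step (probing a few random coordinates and shrinking the step on failure); it is not the exact LPS of [8][9], and all parameter names and the shrink rule are assumptions.

```python
import numpy as np

def limited_pattern_search(x, fit_fn, step, rng, n_dims=3):
    """Probe a few randomly chosen coordinates in +/- step and keep the best
    improvement; shrink the step when nothing improves (illustrative only)."""
    fx = fit_fn(x)
    best, best_fit = x.copy(), fx
    dims = rng.choice(len(x), size=min(n_dims, len(x)), replace=False)
    for d in dims:
        for sign in (1.0, -1.0):
            cand = x.copy()
            cand[d] += sign * step
            f = fit_fn(cand)
            if f < best_fit:
                best, best_fit = cand, f
    if best_fit < fx:
        return best, step             # success: keep the current pattern size
    return x, 0.5 * step              # failure: shrink the pattern
```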
3.5 CGSO
L. D. S. Pacifico and T. B. Ludermir [10] introduced a novel GSO approach based on cooperative behaviour among groups, called Cooperative Group Search Optimizer (CGSO). The modified algorithm works with a set of groups of members, where each group manipulates a limited set of the problem variables, and the solutions of all the groups are combined to obtain the optimized solution for the composite problem. The population members are initially divided into k independent groups, and each group is then associated with d dimensions from the search space (where d × k = n). Each group executes local searches seeking to minimize its own set of variables. The current fitness of each member of a group is evaluated on a concatenation of its own variables with the best contributions of all the groups at the corresponding positions. The CGSO algorithm is presented in Figure 4.
1. for each group G_i do
   a. Randomly generate the members and their fitness values.
2. end for
3. while the terminal conditions are not met do
4. for each group G_i do
   a. Nominate the producer.
   b. Nominate scroungers and perform scrounging.
   c. Nominate rangers and perform ranging.
   d. Recalculate the fitness value of each member.
5. end for
6. end while
Figure 4: Pseudo code for the CGSO algorithm
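The cooperative evaluation, in which a member is scored by concatenating its own d variables with the best contributions of the other groups, can be sketched as follows; the context-vector layout and function names are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np

def cooperative_fitness(member_vars, group_id, group_best, fit_fn):
    """Evaluate a member of group `group_id` on the full n-dimensional problem by
    splicing its own d variables into a context vector built from every group's
    current best contribution."""
    context = np.concatenate(group_best)            # best d-variable block from each group
    d = len(member_vars)
    start = group_id * d                            # this group's slice of the n variables
    trial = context.copy()
    trial[start:start + d] = member_vars            # substitute the member's own block
    return fit_fn(trial)

# Example: n = 6 variables split into k = 3 groups of d = 2, sphere objective.
group_best = [np.zeros(2), np.ones(2), np.full(2, 0.5)]
print(cooperative_fitness(np.array([0.1, -0.2]), 1, group_best,
                          lambda v: float(np.sum(v ** 2))))
```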
CGSO was evaluated on nine well-known benchmark functions for 30-dimensional search-space problems and compared with the standard GSO, PSO and DE algorithms. The experimental results show that CGSO is more efficient than the standard GSO algorithm.

3.6 QGSO
Li and Liu [11] proposed a quick group search optimizer (QGSO) for structural optimization. QGSO has three main features:
(a) an increase in the number of rangers at certain points of the algorithm, when the optimization stops progressing;
(b) a search strategy similar to that of the PSO algorithm, in which the group best and the personal best are considered: every scrounger moves forward by a random walk rather than simply following the producer in every iteration, and it also considers the best position it has ever visited during the move;
(c) rangers that are produced according to the group best and the personal best.
The QGSO algorithm is presented in Figure 5.
QGSO was evaluated on five benchmark pin-connected structures, covering truss structures with both discrete and continuous variables, and compared with the standard GSO and HPSO algorithms. QGSO showed lower computational complexity, a better convergence rate and better accuracy than the algorithms it was compared with.
1. Randomly initialize the velocities and positions of all members.
2. for each particle i in the group do
3. while the conditions are met do
   a. Randomly regenerate the current particle X_i.
4. end while
5. end for
6. while the terminal conditions are not met do
7. for each member i in the group do
   a. Calculate the fitness value.
   b. Nominate the producer, scroungers and rangers.
   c. Use pbest to update the position of the member.
   d. Calculate the fitness value f(X_i^k) of the particle.
   e. Update pbest.
   f. Update gbest.
8. end for
9. end while
Figure 5: Pseudo code for the QGSO algorithm
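Feature (b) of QGSO, the PSO-like scrounger move that blends the member's personal best with the group best, might look like the following sketch; the coefficients c1 and c2 and the exact form of the random pull are assumptions, not the published update rule.

```python
import numpy as np

def qgso_scrounger_move(x, pbest, gbest, rng, c1=2.0, c2=2.0):
    """PSO-flavoured scrounger step: a random walk pulled towards the member's own
    best position (pbest) and the group best / producer position (gbest)."""
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    return x + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)

# Usage: one scrounger in a 5-dimensional search space.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 5)
print(qgso_scrounger_move(x, pbest=x * 0.5, gbest=np.zeros(5), rng=rng))
```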
4. Application of GSO
GSO has been used effectively for solving various real-life problems and has proved to be an efficient algorithm in several practical application domains. In He et al. [12], the weights of an ANN were trained using the GSO algorithm, and the hybrid algorithm was used for classification of a breast cancer dataset. Wu et al. [13] used a GSO with multiple producers (GSOMP) for the optimal placement of multi-type Flexible AC Transmission System (FACTS) devices in a power system, thereby successfully solving a multi-objective optimization problem. He and Li [14] used an ANN trained with the GSO algorithm for machine condition monitoring. Silva et al. [15] used GSO to optimize the input weights and hidden biases of an Extreme Learning Machine (ELM) neural network. Other major applications of the GSO algorithm are listed in Table I.

TABLE I. Use of GSO in various application domains

Algorithms used | Application | Data used
Group Search Optimizer and ANN [14] | GSO was effective in optimizing an ANN used for machine condition monitoring. | Ultrasound data records
GSO and discrete operators [16] | GSO was used for solving the hybrid flow shop scheduling problem. | Ten benchmark problems
GSO [17] | GSO was used for clustering of complex networks and community detection. | Five social network datasets
Swap Operator (SO), Swap Sequence (SS), GSO [18] | GSO was used to solve the Traveling Salesman Problem. | 15 benchmark problems
PSO and GSO [19] | GSO was used for optimal structural design, e.g. a 72-bar spatial structure. | Structural design dataset
Group Search Optimizer algorithm [20] | GSO was used for complex structural optimization problems, including structural design problems in civil construction. | Five typical benchmark structural optimization problems
GSO and harmony search mechanism [21] | Improved GSO was applied to the structural design of trusses. | Structural truss design dataset
GSO [22] | GSO was used for the multicast routing problem. | Network topology consisting of 60 nodes
Evolutionary strategy and GSO [23] | GSO was used for mechanical design optimization problems. | Spring design and mechanical design data
GSO [24] | GSO was used for supervised classification and feature selection. | Colon cancer, leukemia and breast cancer datasets
GSO, binary searching tool [25] | GSO was used for optimizing the network design of an electricity distribution network. | Electricity distribution network data
GSO and Invasive Weed Optimization (IWO) [26] | GSO was used for multimodal optimization problems. | 21 multimodal general functions, 9 composition functions
PSO and GSO [27] | GSO was used for optimizing the location and capacity of distributed electricity generation. | Electricity distribution and generation dataset
GSO [28] | GSO was used to train neural networks. | 23 benchmark functions
GSO [29] | GSO was used to solve the optimal power flow problem. | Electricity distribution network data
GSO and limited pattern search procedure [30] | GSO was used for multi-objective optimization. | 7 benchmark functions
Group Search Optimizer and ANN [31] | GSO was used for the diagnosis of breast cancer. | Breast cancer dataset
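In the neural-network applications above ([12], [14], [15], [28], [31]), the optimizer treats the flattened weight vector of the network as a member's position and the training error as its fitness. The sketch below shows that encoding for a tiny one-hidden-layer network; the layer sizes, the tanh activation and the mean-squared-error fitness are illustrative assumptions, not the exact architectures used in those works.

```python
import numpy as np

def ann_fitness(weights, X, y, hidden=5):
    """Decode a flat weight vector into a one-hidden-layer network and return the
    mean squared training error, which an optimizer such as GSO would minimise.
    Expected length of `weights`: n_in*hidden + hidden + hidden + 1."""
    n_in = X.shape[1]
    w1 = weights[: n_in * hidden].reshape(n_in, hidden)
    b1 = weights[n_in * hidden : n_in * hidden + hidden]
    w2 = weights[n_in * hidden + hidden : n_in * hidden + 2 * hidden]
    b2 = weights[-1]
    h = np.tanh(X @ w1 + b1)                          # hidden layer
    out = h @ w2 + b2                                 # linear output
    return float(np.mean((out - y) ** 2))             # fitness = training error

# Usage: 9 inputs and 5 hidden units give a 9*5 + 5 + 5 + 1 = 56-dimensional position.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(20, 9)), rng.normal(size=20)
print(ann_fitness(rng.normal(size=56), X, y))
```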
5. Conclusion
Several computational intelligence techniques have been proposed to solve and optimize difficult continuous optimization problems, but algorithms inspired by natural behaviour deserve special attention for their performance. This paper has provided an overview of the basic GSO and of different types of modified GSO (FSGSO, CGSO, MPGSO, GSPSO, NMGSO and QGSO). It has also listed some of the applications of the bio-inspired population-based GSO algorithm in various fields for accomplishing various tasks. Several modifications and variations of the basic GSO algorithm have been proposed by different researchers to improve its performance and to solve real-life problems in different application domains. The GSO technique and its numerous modifications provide a number of ways to solve real-world problems more efficiently and quickly.
6. References
[1] S. He, H. Wu and J. R. Saunders, "Group Search Optimizer: An Optimization Algorithm Inspired by Animal Searching Behaviour", IEEE Transactions on Evolutionary Computation, vol. 13, no. 5, pp. 973-990, 2009.
[2] S. He, Q. H. Wu and J. R. Saunders, "A novel group search optimizer inspired by animal behavioural ecology", Proceedings of the IEEE International Conference on Evolutionary Computation, Vancouver, BC, Canada, pp. 1272-1278, 2006.
[3] W. J. O'Brien, B. I. Evans and G. L. Howick, "A new view of the predation cycle of a planktivorous fish, white crappie (Pomoxis annularis)", Can. J. Fish. Aquat. Sci., vol. 43, pp. 1894-1899, 1986.
[4] Joao F. L. Oliveira, Luciano D. S. Pacifico and Teresa B. Ludermir, "A Hybrid Group Search Optimization Based on Fish Swarms", 2013 Brazilian Conference on Intelligent Systems, IEEE, 2013.
[5] L. Li, Z. J. Shao and J. Qian, "An optimizing method based on autonomous animate: Fish swarm algorithm", Proceedings of System Engineering Theory and Practice, vol. 11, pp. 32-38, 2002.
[6] X. Yan and H. Shi, "A Hybrid Algorithm Based on Particle Swarm Optimization and Group Search Optimization", 2011 Seventh Conference on Natural Computation (ICNC), IEEE, vol. 1, pp. 13-17, 2011.
[7] A. B. M. Junaed, M. A. H. Akhand, Al-Mahmud and K. Murase, "Multi-Producer Group Search Optimizer for Function Optimization", International Conference on Informatics, Electronics & Vision (ICIEV), IEEE, pp. 1-4, 2013.
[8] Ling Wang, Xiang Zhong and Min Liu, "A novel group search optimizer for multi-objective optimization", Expert Systems with Applications, vol. 39, pp. 2939-2946, 2012.
[9] W. F. Zhang, S. H. Teng and L. J. Li, "Improved group search optimizer algorithm", Computer Engineering and Applications, vol. 45, no. 4, pp. 48-51, 2009.
[10] L. D. S. Pacifico and T. B. Ludermir, "Cooperative Group Search Optimization", IEEE Congress on Evolutionary Computation, pp. 3299-3306, 2013.
[11] Lijuan Li and Feng Liu, "Optimum Design of Structures with Quick Group Search Optimization Algorithm", Adaptation, Learning, and Optimization, Springer Berlin Heidelberg, vol. 9, pp. 161-206, 2011.
[12] S. He, Q. H. Wu and J. R. Saunders, "Breast Cancer Diagnosis Using an Artificial Neural Network Trained by Group Search Optimizer", Transactions of the Institute of Measurement and Control, vol. 31, no. 6, pp. 517-531, 2009.
[13] Q. H. Wu, Z. Lu, M. S. Li and T. Y. Ji, "Optimal Placement of FACTS Devices by a Group Search Optimizer with Multiple Producers", 2008 Congress on Evolutionary Computation (CEC 2008), Hong Kong, pp. 1033-1039, 2008.
[14] S. He and X. Li, "Application of a group search optimization based artificial neural network to machine condition monitoring", IEEE International Conference on Emerging Technologies and Factory Automation (ETFA 2008), IEEE, pp. 1260-1266, 2008.
[15] D. N. G. Silva, L. D. S. Pacifico and T. B. Ludermir, "An Evolutionary Extreme Learning Machine Based on Group Search Optimization", Proceedings of the 2011 Congress on Evolutionary Computation (CEC 2011), New Orleans, USA, pp. 2297-2304, 2011.
[16] Zhe Cui and Xingsheng Gu, "A Discrete Group Search Optimizer for Hybrid Flowshop Scheduling Problem with Random Breakdown", Mathematical Problems in Engineering, 2014.
[17] G. Kishore Kumar and V. K. Jayaraman, "Clustering of Complex Networks and Community Detection Using Group Search Optimization".
[18] M. A. H. Akhand, A. B. M. Junaed, Md. Forhad Hossain and K. Murase, "Group Search Optimization to solve Traveling Salesman Problem", 15th ICCIT, University of Chittagong, pp. 22-24, 2012.
[19] S. K. Zeng and L. J. Li, "Particle Swarm-Group Search Algorithm and its Application to Spatial Structural Design with Discrete Variables", International Journal of Optimization in Civil Engineering, vol. 2, pp. 443-458, 2012.
[20] Lijuan Li and Feng Liu, "Optimum Design of Structures with Group Search Optimizer Algorithm", Adaptation, Learning, and Optimization, vol. 9, pp. 69-95, 2011.
[21] Lijuan Li and Feng Liu, "Improvements and Applications of Group Search Optimizer in Structural Optimal Design", Adaptation, Learning, and Optimization, vol. 9, pp. 97-157, 2011.
[22] Beena Hassina C. and Sathya M., "A Multi-objective Optimization Strategy based on GSO for the Multicast Routing Problem", International Journal of Advanced Research in Computer Science and Software Engineering, vol. 2, pp. 326-333.
[23] Hai Shen, Yunlong Zhu, Ben Niu and Q. H. Wu, "An improved group search optimizer for mechanical design optimization problems", Progress in Natural Science, vol. 19, no. 1, pp. 91-97, 2009.
[24] Dattatraya Magatrao, Shameek Ghosh, V. K. Jayaraman and Patrick Siarry, "Simultaneous Gene Selection and Cancer Classification using a Hybrid Group Search Optimizer", GECCO '13, 2013.
[25] Saeed Teimourzadeh and Kazem Zare, "Application of binary group search optimization to distribution network reconfiguration", Electrical Power and Energy Systems, vol. 62, pp. 461-468, 2014.
[26] Subhrajit Roy, Sk. Minhazul Islam, Swagatam Das and Saurav Ghosh, "Multimodal optimization by artificial weed colonies enhanced with localized group search optimizers", Applied Soft Computing, vol. 13, pp. 27-46, 2013.
[27] Qi Kang, Tian Lan, Yong Yan, Lei Wang and Qidi Wu, "Group search optimizer based optimal location and capacity of distributed generations", Neurocomputing, vol. 78, pp. 55-63, 2012.
[28] Xingdi Yan, Wen Yang and Hongbo Shi, "A group search optimization based on improved small world and its application on neural network training in ammonia synthesis", Neurocomputing, vol. 97, pp. 94-107, 2012.
[29] Yi Tan, Canbing Li, Yijia Cao, Kwang Y. Lee, Lijuan Li, Shengwei Tang and Lian Zhou, "Improved Group Search Optimization Method for Optimal Power Flow Problem Considering Valve-Point Loading Effects", Neurocomputing, 2013.
[30] Ling Wang, Xiang Zhong and Min Liu, "A novel group search optimizer for multi-objective optimization", Expert Systems with Applications, vol. 39, pp. 2939-2946, 2012.
[31] S. He, Q. H. Wu and J. R. Saunders, "Breast Cancer Diagnosis Using an Artificial Neural Network Trained by Group Search Optimizer", Transactions of the Institute of Measurement and Control, vol. 31, no. 6, pp. 517-531, 2009.