
jaswinder9051998 / zoofs

License: Apache-2.0
zoofs is a Python library for performing feature selection using a variety of nature-inspired wrapper algorithms. The algorithms range from swarm intelligence to physics-based to evolutionary. It's an easy-to-use, flexible and powerful tool to reduce your feature set size.

Programming Languages

Python

Projects that are alternatives to or similar to zoofs

biteopt
Derivative-Free Optimization Method for Global Optimization (C++)
Stars: ✭ 91 (-35.92%)
Mutual labels:  genetic-algorithm, evolutionary-algorithms, optimization-methods, optimization-tools, optimization-algorithms
Solid
🎯 A comprehensive gradient-free optimization framework written in Python
Stars: ✭ 546 (+284.51%)
Mutual labels:  optimization, genetic-algorithm, machine-learning-algorithms, optimization-algorithms
GeneticAlgorithmForFeatureSelection
Search the best feature subset for your classification model
Stars: ✭ 82 (-42.25%)
Mutual labels:  genetic-algorithm, feature-selection, evolutionary-algorithms, machinelearning
Pagmo2
A C++ platform to perform parallel computations of optimisation tasks (global and local) via the asynchronous generalized island model.
Stars: ✭ 540 (+280.28%)
Mutual labels:  optimization, genetic-algorithm, evolutionary-algorithms, optimization-algorithms
Optim
OptimLib: a lightweight C++ library of numerical optimization methods for nonlinear functions
Stars: ✭ 411 (+189.44%)
Mutual labels:  optimization, evolutionary-algorithms, optimization-algorithms
geneticalgorithm2
A supported, highly optimized and flexible genetic algorithm package for Python
Stars: ✭ 36 (-74.65%)
Mutual labels:  genetic-algorithm, evolutionary-algorithms, optimization-algorithms
Geneticalgorithmpython
Source code of PyGAD, a Python 3 library for building the genetic algorithm and training machine learning algorithms (Keras & PyTorch).
Stars: ✭ 435 (+206.34%)
Mutual labels:  optimization, genetic-algorithm, evolutionary-algorithms
evoli
Genetic Algorithm and Particle Swarm Optimization
Stars: ✭ 22 (-84.51%)
Mutual labels:  genetic-algorithm, evolutionary-algorithms, particle-swarm-optimization
Hyperparameter Optimization Of Machine Learning Algorithms
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy & clear)
Stars: ✭ 516 (+263.38%)
Mutual labels:  optimization, genetic-algorithm, machine-learning-algorithms
Jenetics
Jenetics - Genetic Algorithm, Genetic Programming, Evolutionary Algorithm, and Multi-objective Optimization
Stars: ✭ 616 (+333.8%)
Mutual labels:  optimization, genetic-algorithm, evolutionary-algorithms
Scikit Opt
Genetic Algorithm, Particle Swarm Optimization, Simulated Annealing, Ant Colony Optimization Algorithm, Immune Algorithm, Artificial Fish Swarm Algorithm, Differential Evolution and TSP (Traveling Salesman)
Stars: ✭ 2,791 (+1865.49%)
Mutual labels:  optimization, genetic-algorithm, particle-swarm-optimization
GARI
GARI (Genetic Algorithm for Reproducing Images) reproduces a single image using Genetic Algorithm (GA) by evolving pixel values.
Stars: ✭ 41 (-71.13%)
Mutual labels:  genetic-algorithm, evolutionary-algorithms, optimization-algorithms
bio ik
MoveIt kinematics_base plugin based on particle optimization & GA
Stars: ✭ 104 (-26.76%)
Mutual labels:  optimization, genetic-algorithm, particle-swarm-optimization
goga
goga is a Go library for developing evolutionary and genetic algorithms to solve optimisation problems with (or without) many constraints and many objectives. It also aims to handle mixed-type representations (reals and integers).
Stars: ✭ 39 (-72.54%)
Mutual labels:  optimization, genetic-algorithm, evolutionary-algorithms
Pygmo2
A Python platform to perform parallel computations of optimisation tasks (global and local) via the asynchronous generalized island model.
Stars: ✭ 134 (-5.63%)
Mutual labels:  optimization, evolutionary-algorithms, optimization-algorithms
Ascension
A metaheuristic optimization framework
Stars: ✭ 24 (-83.1%)
Mutual labels:  genetic-algorithm, optimization-methods, optimization-algorithms
Free Ai Resources
🚀 FREE AI Resources - 🎓 Courses, 👷 Jobs, 📝 Blogs, 🔬 AI Research, and many more - for everyone!
Stars: ✭ 192 (+35.21%)
Mutual labels:  machine-learning-algorithms, supervised-learning, machinelearning
geneal
A genetic algorithm implementation in python
Stars: ✭ 47 (-66.9%)
Mutual labels:  optimization, genetic-algorithm, optimization-algorithms
Eaopt
🍀 Evolutionary optimization library for Go (genetic algorithm, particle swarm optimization, differential evolution)
Stars: ✭ 718 (+405.63%)
Mutual labels:  optimization, genetic-algorithm, evolutionary-algorithms
Heart disease prediction
Heart Disease prediction using 5 algorithms
Stars: ✭ 43 (-69.72%)
Mutual labels:  machine-learning-algorithms, supervised-learning, machinelearning

zoofs Logo Header

🐾 zoofs (Zoo Feature Selection)


zoofs is a Python library for performing feature selection using a variety of nature-inspired wrapper algorithms. The algorithms range from swarm intelligence to physics-based to evolutionary. It's an easy-to-use, flexible and powerful tool to reduce your feature set size.

🌟 Like this Project? Give us a star!

📘 Documentation

https://jaswinder9051998.github.io/zoofs/

🔗 What's new in V0.1.24

  • kwargs can now be passed through to the objective function
  • improved logger for results
  • added the Harris Hawk algorithm
  • timeout can now be passed as a parameter to stop the operation after the given number of seconds, a handy alternative to fixing the number of iterations (see the sketch after this list)
  • feature-score hashing of visited feature sets to improve overall performance
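A minimal sketch of the last two options, assuming a prepared train/validation split. timeout is documented below as a constructor parameter; the extra keyword argument (my_penalty, a hypothetical name) is assumed to be forwarded from fit to the objective function per the release note above:

from sklearn.metrics import log_loss
from zoofs import ParticleSwarmOptimization
import lightgbm as lgb

# objective function that also accepts a custom keyword argument
def objective_function_topass(model, X_train, y_train, X_valid, y_valid, my_penalty=0.0):
    model.fit(X_train, y_train)
    # my_penalty is a hypothetical extra term supplied through kwargs
    return log_loss(y_valid, model.predict_proba(X_valid)) + my_penalty

# timeout=60 stops the search after roughly 60 seconds, even if fewer
# than n_iteration iterations have completed
algo_object = ParticleSwarmOptimization(objective_function_topass, n_iteration=1000,
                                        population_size=20, minimize=True, timeout=60)
algo_object.fit(lgb.LGBMClassifier(), X_train, y_train, X_valid, y_valid,
                verbose=True, my_penalty=0.01)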

🛠 Installation

Using pip

Use the package manager to install zoofs.

pip install zoofs

📜 Available Algorithms

Algorithm Name | Class Name | Description | Reference (DOI)
Particle Swarm Algorithm | ParticleSwarmOptimization | Utilizes swarm behaviour | https://doi.org/10.1007/978-3-319-13563-2_51
Grey Wolf Algorithm | GreyWolfOptimization | Utilizes wolf hunting behaviour | https://doi.org/10.1016/j.neucom.2015.06.083
Dragon Fly Algorithm | DragonFlyOptimization | Utilizes dragonfly swarm behaviour | https://doi.org/10.1016/j.knosys.2020.106131
Harris Hawk Algorithm | HarrisHawkOptimization | Utilizes hawk hunting behaviour | https://link.springer.com/chapter/10.1007/978-981-32-9990-0_12
Genetic Algorithm | GeneticOptimization | Utilizes genetic mutation behaviour | https://doi.org/10.1109/ICDAR.2001.953980
Gravitational Algorithm | GravitationalOptimization | Utilizes Newton's gravitational behaviour | https://doi.org/10.1109/ICASSP.2011.5946916

More algorithms coming soon, stay tuned!

  • Try it now: Open In Colab

⚡️ Usage

Define your own objective function for optimization!

Classification Example

from sklearn.metrics import log_loss
# define your own objective function: it receives the model and the four data splits,
# fits your model and returns the objective value
def objective_function_topass(model, X_train, y_train, X_valid, y_valid):
    model.fit(X_train, y_train)
    return log_loss(y_valid, model.predict_proba(X_valid))

# import an algorithm
from zoofs import ParticleSwarmOptimization
# create an object of the algorithm
algo_object = ParticleSwarmOptimization(objective_function_topass, n_iteration=20,
                                        population_size=20, minimize=True)
import lightgbm as lgb
lgb_model = lgb.LGBMClassifier()
# fit the algorithm
algo_object.fit(lgb_model, X_train, y_train, X_valid, y_valid, verbose=True)
# plot your results
algo_object.plot_history()
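The example assumes X_train, y_train, X_valid and y_valid already exist. One minimal way to produce them, with a synthetic dataset purely for illustration (zoofs expects pandas inputs):

import pandas as pd
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# synthetic dataset purely for illustration
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X = pd.DataFrame(X, columns=[f"f{i}" for i in range(X.shape[1])])
y = pd.Series(y)

# hold out a validation split for the objective function
X_train, X_valid, y_train, y_valid = train_test_split(X, y, test_size=0.2, random_state=0)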

Regression Example

from sklearn.metrics import mean_squared_error
# define your own objective function: it receives the model and the four data splits,
# fits your model and returns the objective value
def objective_function_topass(model, X_train, y_train, X_valid, y_valid):
    model.fit(X_train, y_train)
    return mean_squared_error(y_valid, model.predict(X_valid))

# import an algorithm
from zoofs import ParticleSwarmOptimization
# create an object of the algorithm
algo_object = ParticleSwarmOptimization(objective_function_topass, n_iteration=20,
                                        population_size=20, minimize=True)
import lightgbm as lgb
lgb_model = lgb.LGBMRegressor()
# fit the algorithm
algo_object.fit(lgb_model, X_train, y_train, X_valid, y_valid, verbose=True)
# plot your results
algo_object.plot_history()

Suggestions for Usage

  • Since the available algorithms are wrapper algorithms, it is better to use ML models that train quickly, e.g. LightGBM or CatBoost.
  • Choose a sufficiently large 'population_size', as this determines the extent of exploration and exploitation of the algorithm.
  • Ensure that your ML model has its hyperparameters optimized before passing it to zoofs algorithms (a sketch of this workflow follows this list).
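For the last point, a hedged sketch of the workflow: tune the estimator first (the parameter grid below is illustrative), then hand the tuned instance to a zoofs algorithm such as the algo_object defined in the examples above:

import lightgbm as lgb
from sklearn.model_selection import RandomizedSearchCV

# tune the model's hyperparameters first ...
search = RandomizedSearchCV(
    lgb.LGBMClassifier(),
    {"n_estimators": [100, 200, 400], "num_leaves": [15, 31, 63]},
    n_iter=5, cv=3, random_state=0,
)
search.fit(X_train, y_train)

# ... then run feature selection with the tuned estimator
algo_object.fit(search.best_estimator_, X_train, y_train, X_valid, y_valid, verbose=True)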

Objective score plot



Algorithms

Particle Swarm Algorithm

Particle Swarm

In computational science, particle swarm optimization (PSO) is a computational method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. It solves a problem by having a population of candidate solutions, here dubbed particles, and moving these particles around in the search-space according to simple mathematical formulae over each particle's position and velocity. Each particle's movement is influenced by its local best known position, but it is also guided toward the best known positions in the search-space, which are updated as better positions are found by other particles. This is expected to move the swarm toward the best solutions.
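The movement described above reduces to the canonical velocity and position update. A NumPy sketch of that formula, reusing the c1, c2 and w names of the class parameters below (an illustration only, not the library's internal code):

import numpy as np

def pso_step(x, v, pbest, gbest, c1=2.0, c2=2.0, w=0.9, rng=None):
    """One canonical PSO update for a population of positions x and velocities v."""
    rng = rng or np.random.default_rng()
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    # inertia + pull toward each particle's best + pull toward the swarm's best
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v, v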


class zoofs.ParticleSwarmOptimization(objective_function, n_iteration=50, population_size=50, minimize=True, c1=2, c2=2, w=0.9, timeout=None)


Parameters

objective_function : user-defined function with the signature 'func(model, X_train, y_train, X_valid, y_valid)'
    The function must return a value that is to be minimized/maximized.
n_iteration : int, default=50
    Number of iterations the algorithm will run.
timeout : int, default=None
    Stop the operation after the given number of seconds. If set to None, the operation runs without a time limit and n_iteration is followed.
population_size : int, default=50
    Total size of the population.
minimize : bool, default=True
    Defines whether the objective value is to be minimized or maximized.
c1 : float, default=2.0
    First acceleration coefficient of the particle swarm.
c2 : float, default=2.0
    Second acceleration coefficient of the particle swarm.
w : float, default=0.9
    Inertia weight parameter.

Attributes

best_feature_list : array-like
    Final best set of features.

Methods

Method | Description
fit | Run the algorithm
plot_history | Plot results achieved across iterations

fit(model, X_train, y_train, X_valid, y_valid, verbose=True)

Parameters

model : object
    Machine learning model instance.
X_train : pandas.core.frame.DataFrame of shape (n_samples, n_features)
    Training input samples to be used for the machine learning model.
y_train : pandas.core.frame.DataFrame or pandas.core.series.Series of shape (n_samples,)
    The target values (class labels in classification, real numbers in regression).
X_valid : pandas.core.frame.DataFrame of shape (n_samples, n_features)
    Validation input samples.
y_valid : pandas.core.frame.DataFrame or pandas.core.series.Series of shape (n_samples,)
    The validation target values.
verbose : bool, default=True
    Print results for each iteration.

Returns

best_feature_list : array-like
    Final best set of features.

plot_history()

Plot results across iterations

Example

from sklearn.metrics import log_loss
# define your own objective function: it receives the model and the four data splits,
# fits your model and returns the objective value
def objective_function_topass(model, X_train, y_train, X_valid, y_valid):
    model.fit(X_train, y_train)
    return log_loss(y_valid, model.predict_proba(X_valid))

# import an algorithm
from zoofs import ParticleSwarmOptimization
# create an object of the algorithm
algo_object = ParticleSwarmOptimization(objective_function_topass, n_iteration=20,
                                        population_size=20, minimize=True, c1=2, c2=2, w=0.9)
import lightgbm as lgb
lgb_model = lgb.LGBMClassifier()
# fit the algorithm
algo_object.fit(lgb_model, X_train, y_train, X_valid, y_valid, verbose=True)
# plot your results
algo_object.plot_history()
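Once fit has run, the selected subset is exposed through best_feature_list and can be used to retrain on the reduced data. A short sketch, assuming the attribute holds column names, as the pandas-based fit signature suggests:

# retrain on the selected columns only
selected = algo_object.best_feature_list
final_model = lgb.LGBMClassifier()
final_model.fit(X_train[selected], y_train)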


Grey Wolf Algorithm

Grey Wolf

The Grey Wolf Optimizer (GWO) mimics the leadership hierarchy and hunting mechanism of grey wolves in nature. Four types of grey wolves such as alpha, beta, delta, and omega are employed for simulating the leadership hierarchy. In addition, three main steps of hunting, searching for prey, encircling prey, and attacking prey, are implemented to perform optimization.
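For reference, the canonical continuous GWO position update averages the pulls of the three best wolves. A NumPy sketch of that idea (illustration only; zoofs itself searches over feature subsets):

import numpy as np

def gwo_step(X, alpha, beta, delta, a, rng=None):
    """One canonical GWO update; `a` decays linearly from 2 to 0 over iterations."""
    rng = rng or np.random.default_rng()
    def pull_toward(leader):
        A = 2 * a * rng.random(X.shape) - a   # exploration/exploitation coefficient
        C = 2 * rng.random(X.shape)           # leader emphasis coefficient
        return leader - A * np.abs(C * leader - X)
    # average the pulls of the alpha, beta and delta wolves
    return (pull_toward(alpha) + pull_toward(beta) + pull_toward(delta)) / 3.0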


class zoofs.GreyWolfOptimization(objective_function, n_iteration=50, population_size=50, minimize=True, timeout=None)


Parameters

objective_function : user-defined function with the signature 'func(model, X_train, y_train, X_valid, y_valid)'
    The function must return a value that is to be minimized/maximized.
n_iteration : int, default=50
    Number of iterations the algorithm will run.
timeout : int, default=None
    Stop the operation after the given number of seconds. If set to None, the operation runs without a time limit and n_iteration is followed.
population_size : int, default=50
    Total size of the population.
method : {1, 2}, default=1
    Choose between the two methods of grey wolf optimization.
minimize : bool, default=True
    Defines whether the objective value is to be minimized or maximized.

Attributes

best_feature_list : array-like
    Final best set of features.

Methods

Method | Description
fit | Run the algorithm
plot_history | Plot results achieved across iterations

fit(model,X_train,y_train,X_valid,y_valid,method=1,verbose=True)

Parameters

model : object
    Machine learning model instance.
X_train : pandas.core.frame.DataFrame of shape (n_samples, n_features)
    Training input samples to be used for the machine learning model.
y_train : pandas.core.frame.DataFrame or pandas.core.series.Series of shape (n_samples,)
    The target values (class labels in classification, real numbers in regression).
X_valid : pandas.core.frame.DataFrame of shape (n_samples, n_features)
    Validation input samples.
y_valid : pandas.core.frame.DataFrame or pandas.core.series.Series of shape (n_samples,)
    The validation target values.
verbose : bool, default=True
    Print results for each iteration.

Returns

best_feature_list : array-like
    Final best set of features.

plot_history()

Plot results across iterations

Example

from sklearn.metrics import log_loss
# define your own objective function: it receives the model and the four data splits,
# fits your model and returns the objective value
def objective_function_topass(model, X_train, y_train, X_valid, y_valid):
    model.fit(X_train, y_train)
    return log_loss(y_valid, model.predict_proba(X_valid))

# import an algorithm
from zoofs import GreyWolfOptimization
# create an object of the algorithm
algo_object = GreyWolfOptimization(objective_function_topass, n_iteration=20,
                                   population_size=20, minimize=True)
import lightgbm as lgb
lgb_model = lgb.LGBMClassifier()
# fit the algorithm (method is passed to fit, per the signature above)
algo_object.fit(lgb_model, X_train, y_train, X_valid, y_valid, method=1, verbose=True)
# plot your results
algo_object.plot_history()


Dragon Fly Algorithm

Dragon Fly

The main inspiration of the Dragonfly Algorithm (DA) originates from the static and dynamic swarming behaviours of dragonflies. These two swarming behaviours are very similar to the two main phases of optimization using meta-heuristics: exploration and exploitation. In a static swarm, dragonflies form sub-swarms and fly over different areas, which is the main objective of the exploration phase. In a dynamic swarm, however, dragonflies fly in bigger swarms and along one direction, which is favourable in the exploitation phase.
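The two behaviours translate into five weighted terms in the canonical dragonfly step (Mirjalili's DA). A NumPy sketch for a single dragonfly, with illustrative weights (not the library's internals):

import numpy as np

def dragonfly_step(X, dX, nbr_X, nbr_dX, food, enemy,
                   s=0.1, a=0.1, c=0.7, f=1.0, e=1.0, w=0.9):
    """One canonical DA update; nbr_X/nbr_dX are neighbouring positions/steps."""
    S = -np.sum(nbr_X - X, axis=0)      # separation from neighbours
    A = nbr_dX.mean(axis=0)             # alignment with neighbours' velocities
    C = nbr_X.mean(axis=0) - X          # cohesion toward the neighbourhood centre
    F = food - X                        # attraction toward the best solution
    E = enemy + X                       # distraction away from the worst solution
    dX = s * S + a * A + c * C + f * F + e * E + w * dX
    return X + dX, dX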


class zoofs.DragonFlyOptimization(objective_function, n_iteration=50, population_size=50, minimize=True, timeout=None)


Parameters

objective_function : user-defined function with the signature 'func(model, X_train, y_train, X_valid, y_valid)'
    The function must return a value that is to be minimized/maximized.
n_iteration : int, default=50
    Number of iterations the algorithm will run.
timeout : int, default=None
    Stop the operation after the given number of seconds. If set to None, the operation runs without a time limit and n_iteration is followed.
population_size : int, default=50
    Total size of the population.
method : {'linear', 'random', 'quadraic', 'sinusoidal'}, default='sinusoidal'
    Choose between the four methods of Dragon Fly optimization.
minimize : bool, default=True
    Defines whether the objective value is to be minimized or maximized.

Attributes

best_feature_list : array-like
    Final best set of features.

Methods

Method | Description
fit | Run the algorithm
plot_history | Plot results achieved across iterations

fit(model,X_train,y_train,X_valid,y_valid,method='sinusoidal',verbose=True)

Parameters

model : object
    Machine learning model instance.
X_train : pandas.core.frame.DataFrame of shape (n_samples, n_features)
    Training input samples to be used for the machine learning model.
y_train : pandas.core.frame.DataFrame or pandas.core.series.Series of shape (n_samples,)
    The target values (class labels in classification, real numbers in regression).
X_valid : pandas.core.frame.DataFrame of shape (n_samples, n_features)
    Validation input samples.
y_valid : pandas.core.frame.DataFrame or pandas.core.series.Series of shape (n_samples,)
    The validation target values.
verbose : bool, default=True
    Print results for each iteration.

Returns

best_feature_list : array-like
    Final best set of features.

plot_history()

Plot results across iterations

Example

from sklearn.metrics import log_loss
# define your own objective function: it receives the model and the four data splits,
# fits your model and returns the objective value
def objective_function_topass(model, X_train, y_train, X_valid, y_valid):
    model.fit(X_train, y_train)
    return log_loss(y_valid, model.predict_proba(X_valid))

# import an algorithm
from zoofs import DragonFlyOptimization
# create an object of the algorithm
algo_object = DragonFlyOptimization(objective_function_topass, n_iteration=20,
                                    population_size=20, minimize=True)
import lightgbm as lgb
lgb_model = lgb.LGBMClassifier()
# fit the algorithm (method is passed to fit, per the signature above)
algo_object.fit(lgb_model, X_train, y_train, X_valid, y_valid, method='sinusoidal', verbose=True)
# plot your results
algo_object.plot_history()


Harris Hawk Optimization

Harris Hawk

HHO is a popular swarm-based, gradient-free optimization algorithm with several active and time-varying phases of exploration and exploitation. The algorithm was first published in the journal Future Generation Computer Systems (FGCS) in 2019, and it has since gained increasing attention among researchers due to its flexible structure, high performance, and high-quality results. The main logic of the HHO method is designed based on the cooperative behaviour and chasing styles of Harris' hawks in nature, called the "surprise pounce". Currently, there are many suggestions on how to enhance the functionality of HHO, and several enhanced variants have appeared in leading Elsevier and IEEE Transactions journals.
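The beta parameter documented below drives the Levy random walk used in HHO's rapid dives. A sketch of the standard Levy step via Mantegna's algorithm (illustration only, not the library's internals):

import numpy as np
from scipy.special import gamma

def levy_step(dim, beta=0.5, rng=None):
    """One Levy-flight step drawn with Mantegna's algorithm."""
    rng = rng or np.random.default_rng()
    sigma = (gamma(1 + beta) * np.sin(np.pi * beta / 2)
             / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)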


class zoofs.HarrisHawkOptimization(objective_function, n_iteration=50, population_size=50, minimize=True, beta=0.5, timeout=None)


Parameters

objective_function : user-defined function with the signature 'func(model, X_train, y_train, X_valid, y_valid)'
    The function must return a value that is to be minimized/maximized.
n_iteration : int, default=50
    Number of iterations the algorithm will run.
timeout : int, default=None
    Stop the operation after the given number of seconds. If set to None, the operation runs without a time limit and n_iteration is followed.
population_size : int, default=50
    Total size of the population.
minimize : bool, default=True
    Defines whether the objective value is to be minimized or maximized.
beta : float, default=0.5
    Value for the Levy random walk.

Attributes

best_feature_list : array-like
    Final best set of features.

Methods

Method | Description
fit | Run the algorithm
plot_history | Plot results achieved across iterations

fit(model, X_train, y_train, X_valid, y_valid, verbose=True)

Parameters

model : object
    Machine learning model instance.
X_train : pandas.core.frame.DataFrame of shape (n_samples, n_features)
    Training input samples to be used for the machine learning model.
y_train : pandas.core.frame.DataFrame or pandas.core.series.Series of shape (n_samples,)
    The target values (class labels in classification, real numbers in regression).
X_valid : pandas.core.frame.DataFrame of shape (n_samples, n_features)
    Validation input samples.
y_valid : pandas.core.frame.DataFrame or pandas.core.series.Series of shape (n_samples,)
    The validation target values.
verbose : bool, default=True
    Print results for each iteration.

Returns

best_feature_list : array-like
    Final best set of features.

plot_history()

Plot results across iterations

Example

from sklearn.metrics import log_loss
# define your own objective function: it receives the model and the four data splits,
# fits your model and returns the objective value
def objective_function_topass(model, X_train, y_train, X_valid, y_valid):
    model.fit(X_train, y_train)
    return log_loss(y_valid, model.predict_proba(X_valid))

# import an algorithm
from zoofs import HarrisHawkOptimization
# create an object of the algorithm
algo_object = HarrisHawkOptimization(objective_function_topass, n_iteration=20,
                                     population_size=20, minimize=True)
import lightgbm as lgb
lgb_model = lgb.LGBMClassifier()
# fit the algorithm
algo_object.fit(lgb_model, X_train, y_train, X_valid, y_valid, verbose=True)
# plot your results
algo_object.plot_history()


Genetic Algorithm

Genetic Algorithm

In computer science and operations research, a genetic algorithm (GA) is a metaheuristic inspired by the process of natural selection that belongs to the larger class of evolutionary algorithms (EA). Genetic algorithms are commonly used to generate high-quality solutions to optimization and search problems by relying on biologically inspired operators such as mutation, crossover and selection. Some examples of GA applications include optimizing decision trees for better performance, automatically solving sudoku puzzles, and hyperparameter optimization.
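A natural encoding for wrapper feature selection is a binary mask over features, and the operators named above then come down to simple array manipulations. A sketch of that idea (not the library's internals):

import numpy as np

rng = np.random.default_rng(0)

def crossover(parent_a, parent_b):
    """Single-point crossover between two binary feature masks."""
    point = rng.integers(1, len(parent_a))
    return np.concatenate([parent_a[:point], parent_b[point:]])

def mutate(mask, mutation_rate=0.05):
    """Flip each bit of a binary mask with probability mutation_rate."""
    flips = rng.random(len(mask)) < mutation_rate
    return np.where(flips, 1 - mask, mask)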


class zoofs.GeneticOptimization(objective_function, n_iteration=20, population_size=20, selective_pressure=2, elitism=2, mutation_rate=0.05, minimize=True, timeout=None)


Parameters

objective_function : user-defined function with the signature 'func(model, X_train, y_train, X_valid, y_valid)'
    The function must return a value that is to be minimized/maximized.
n_iteration : int, default=20
    Number of iterations the algorithm will run.
timeout : int, default=None
    Stop the operation after the given number of seconds. If set to None, the operation runs without a time limit and n_iteration is followed.
population_size : int, default=20
    Total size of the population.
selective_pressure : int, default=2
    Measure of reproductive opportunities for each organism in the population.
elitism : int, default=2
    Number of top individuals to be considered as elites.
mutation_rate : float, default=0.05
    Rate of mutation in the population's genes.
minimize : bool, default=True
    Defines whether the objective value is to be minimized or maximized.

Attributes

best_feature_list : array-like
    Final best set of features.

Methods

Method | Description
fit | Run the algorithm
plot_history | Plot results achieved across iterations

fit(model,X_train,y_train,X_valid,y_valid,verbose=True)

Parameters

model : object
    Machine learning model instance.
X_train : pandas.core.frame.DataFrame of shape (n_samples, n_features)
    Training input samples to be used for the machine learning model.
y_train : pandas.core.frame.DataFrame or pandas.core.series.Series of shape (n_samples,)
    The target values (class labels in classification, real numbers in regression).
X_valid : pandas.core.frame.DataFrame of shape (n_samples, n_features)
    Validation input samples.
y_valid : pandas.core.frame.DataFrame or pandas.core.series.Series of shape (n_samples,)
    The validation target values.
verbose : bool, default=True
    Print results for each iteration.

Returns

best_feature_list : array-like
    Final best set of features.

plot_history()

Plot results across iterations

Example

from sklearn.metrics import log_loss
# define your own objective function: it receives the model and the four data splits,
# fits your model and returns the objective value
def objective_function_topass(model, X_train, y_train, X_valid, y_valid):
    model.fit(X_train, y_train)
    return log_loss(y_valid, model.predict_proba(X_valid))

# import an algorithm
from zoofs import GeneticOptimization
# create an object of the algorithm
algo_object = GeneticOptimization(objective_function_topass, n_iteration=20,
                                  population_size=20, selective_pressure=2, elitism=2,
                                  mutation_rate=0.05, minimize=True)
import lightgbm as lgb
lgb_model = lgb.LGBMClassifier()
# fit the algorithm
algo_object.fit(lgb_model, X_train, y_train, X_valid, y_valid, verbose=True)
# plot your results
algo_object.plot_history()

Gravitational Algorithm

Gravitational Algorithm

The Gravitational Algorithm is based on the law of gravity and mass interactions. In this algorithm, the searcher agents are a collection of masses that interact with each other based on Newtonian gravity and the laws of motion.
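The description maps directly onto Newton's law. A sketch of the pairwise attraction used in gravitational search, with g0 and eps named after the class parameters below (illustration only):

import numpy as np

def gravity_force(x_i, x_j, m_i, m_j, g0=100.0, eps=0.5):
    """Attraction on agent i from agent j; eps keeps the force finite as r -> 0."""
    r = np.linalg.norm(x_j - x_i)
    return g0 * (m_i * m_j) / (r + eps) * (x_j - x_i)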


class zoofs.GravitationalOptimization(objective_function, n_iteration=50, population_size=50, g0=100, eps=0.5, minimize=True, timeout=None)


Parameters

objective_function : user-defined function with the signature 'func(model, X_train, y_train, X_valid, y_valid)'
    The function must return a value that is to be minimized/maximized.
n_iteration : int, default=50
    Number of iterations the algorithm will run.
timeout : int, default=None
    Stop the operation after the given number of seconds. If set to None, the operation runs without a time limit and n_iteration is followed.
population_size : int, default=50
    Total size of the population.
g0 : float, default=100
    Gravitational strength constant.
eps : float, default=0.5
    Distance constant.
minimize : bool, default=True
    Defines whether the objective value is to be minimized or maximized.

Attributes

best_feature_list : array-like
    Final best set of features.

Methods

Method | Description
fit | Run the algorithm
plot_history | Plot results achieved across iterations

fit(model,X_train,y_train,X_valid,y_valid,verbose=True)

Parameters

model : object
    Machine learning model instance.
X_train : pandas.core.frame.DataFrame of shape (n_samples, n_features)
    Training input samples to be used for the machine learning model.
y_train : pandas.core.frame.DataFrame or pandas.core.series.Series of shape (n_samples,)
    The target values (class labels in classification, real numbers in regression).
X_valid : pandas.core.frame.DataFrame of shape (n_samples, n_features)
    Validation input samples.
y_valid : pandas.core.frame.DataFrame or pandas.core.series.Series of shape (n_samples,)
    The validation target values.
verbose : bool, default=True
    Print results for each iteration.

Returns

best_feature_list : array-like
    Final best set of features.

plot_history()

Plot results across iterations

Example

from sklearn.metrics import log_loss
# define your own objective function: it receives the model and the four data splits,
# fits your model and returns the objective value
def objective_function_topass(model, X_train, y_train, X_valid, y_valid):
    model.fit(X_train, y_train)
    return log_loss(y_valid, model.predict_proba(X_valid))

# import an algorithm
from zoofs import GravitationalOptimization
# create an object of the algorithm
algo_object = GravitationalOptimization(objective_function_topass, n_iteration=50,
                                        population_size=50, g0=100, eps=0.5, minimize=True)
import lightgbm as lgb
lgb_model = lgb.LGBMClassifier()
# fit the algorithm
algo_object.fit(lgb_model, X_train, y_train, X_valid, y_valid, verbose=True)
# plot your results
algo_object.plot_history()

Support zoofs

The development of zoofs relies completely on contributions.

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Please make sure to update tests as appropriate.

First rollout

18.08.2021

License

Apache-2.0
