AnalyticalEngine.jl

[draft] Hardware-agnostic machine learning models that run on CPUs, GPUs, distributed architectures, etc.

[DEPRECATED] AnalyticalEngine

/!\ This package is deprecated in favour of MLJ.

Who's behind this

  • Thibaut Lienart (Imperial College London)
  • Miguel Morin (Alan Turing Institute)
  • Sebastian Vollmer (University of Warwick, Alan Turing Institute)
  • Franz Kiraly (University College London)
  • Mike Innes (Julia Computing)
  • Avik Sengupta (Julia Computing)
  • Valentin Churavy (Massachusetts Institute of Technology)

Aims and Milestones

Milestones

  • March 2018
    • [working prototype] have a basic GeneralizedLinearRegression that works well, showcases the ideas, and works with Flux (see the sketch after this list)
    • [WIP] have a basic pipeline JuliaDB -> AnalyticalEngine
    • have an interface with DecisionTree.jl
    • [WIP] have a way to deal with hyperparameters that works well with meta-learning
  • August 2018
    • have a full pipeline JuliaDB -> FeatEng -> AnalyticalEngine
    • have a working framework for metalearning
    • have working tools for hyperparameter tuning (Bayesian optimisation, K-fold cross-validation, ...)
  • Longer term
    • In the spirit of MLR, we'd like to interface with as many dedicated packages ("solvers") as possible and promote their creation and maintenance.
      • In a first phase we will focus on the general pipeline, hyperparameter management, etc.; interfacing with solvers will become the key focus once we have a strong central API.
      • Many packages implement or re-implement specific capabilities; we hope a strong shared API will lead to consolidation into packages that solve generic tasks efficiently.
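
To make the first milestone concrete, here is a minimal, self-contained sketch of a ridge-penalised linear regression fitted by gradient descent. This is plain Julia, not the AnalyticalEngine or Flux API; fit_ridge and its keyword arguments are hypothetical names used only to illustrate the kind of model envisaged.

```julia
# A minimal sketch (NOT the AnalyticalEngine or Flux API) of the kind of
# model the first milestone describes: a ridge-penalised linear regression
# fitted by plain gradient descent. All names below are illustrative.
using LinearAlgebra, Random

function fit_ridge(X, y; lambda = 0.1, lr = 0.1, steps = 500)
    n, p = size(X)
    beta = zeros(p)
    for _ in 1:steps
        # gradient of (1/2n) * ||y - X*beta||^2 + (lambda/2) * ||beta||^2
        grad = X' * (X * beta - y) / n + lambda * beta
        beta -= lr * grad
    end
    return beta
end

Random.seed!(1)
X = randn(100, 3)
y = X * [1.0, -2.0, 0.5] + 0.1 * randn(100)
beta_hat = fit_ridge(X, y)   # recovers coefficients close to [1.0, -2.0, 0.5]
```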

Aims

  • Major aims: offer a modern scikit-learn-style package that can:
    • work efficiently with large databases (via JuliaDB)
    • work efficiently with different compute infrastructure (parallel, GPU, ...)
    • work with generic optimisation algorithms (via Optim.jl)
    • work with auto-diff algorithms (via Flux.jl)
    • offer extensible meta-learning
    • offer modern and extensible hyperparameter tuning (such as Bayesian opt)
    • be extended easily by researchers/users, so that the code closely mirrors the maths (see the sketch after this list)
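
As an illustration of the last aim, here is a hedged sketch using Optim.jl: the penalised least-squares objective is written exactly as it reads on paper and handed to a generic optimiser. Optim.jl's optimize and BFGS are real; the data and the objective helper are invented for the example and say nothing about AnalyticalEngine's actual interface.

```julia
# A minimal sketch of the "maths matches the code" aim, assuming Optim.jl is
# installed. Optim.jl is a real package; the data and the `objective` helper
# are made up for the example.
using Optim, Random

Random.seed!(1)
X = randn(100, 3)
y = X * [1.0, -2.0, 0.5] + 0.1 * randn(100)
lambda = 0.1

# L(beta) = (1/2n) * ||y - X*beta||^2 + (lambda/2) * ||beta||^2
objective(beta) = sum(abs2, y - X * beta) / (2 * length(y)) +
                  lambda / 2 * sum(abs2, beta)

result   = optimize(objective, zeros(3), BFGS())
beta_hat = Optim.minimizer(result)
```

Because the optimiser only needs a callable objective, swapping BFGS for another algorithm, or wrapping the call in a hyperparameter-tuning loop, requires no change to the model code.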

Inspiration