A C++ standalone library for machine learning
Flashlight is a fast, flexible machine learning library written entirely in C++ from Facebook AI Research and the creators of Torch, TensorFlow, Eigen, and Deep Speech.
Native support in C++ and simple extensibility make Flashlight a powerful research framework that enables fast iteration on new experimental setups and algorithms, with minimal imposed structure and without sacrificing performance. In a single repository, Flashlight provides apps for research across multiple domains, including speech recognition, image classification, and language modeling.
Flashlight is broken down into a few parts:

- flashlight/lib contains kernels and standalone utilities for audio processing and more.
- flashlight/fl is the core tensor interface and neural network library, using the ArrayFire tensor library by default.
- flashlight/pkg contains domain packages for speech, vision, and text built on the core.
- flashlight/app contains applications of the core library to machine learning across domains.

First, build and install Flashlight and link it to your own project.
Sequential forms a sequence of Flashlight Modules for chaining computation.
#include <flashlight/fl/flashlight.h>
Sequential model;
model.add(View(fl::Shape({IM_DIM, IM_DIM, 1, -1})));
model.add(Conv2D(
1 /* input channels */,
32 /* output channels */,
5 /* kernel width */,
5 /* kernel height */,
1 /* stride x */,
1 /* stride y */,
    PaddingMode::SAME /* padding mode */,
    PaddingMode::SAME /* padding mode */));
model.add(ReLU());
model.add(Pool2D(
2 /* kernel width */,
2 /* kernel height */,
2 /* stride x */,
2 /* stride y */));
model.add(Conv2D(32, 64, 5, 5, 1, 1, PaddingMode::SAME, PaddingMode::SAME));
model.add(ReLU());
model.add(Pool2D(2, 2, 2, 2));
model.add(View(fl::Shape({7 * 7 * 64, -1})));
model.add(Linear(7 * 7 * 64, 1024));
model.add(ReLU());
model.add(Dropout(0.5));
model.add(Linear(1024, 10));
model.add(LogSoftmax());
Performing forward and backward computation is straightforward:
auto output = model.forward(input);
auto loss = categoricalCrossEntropy(output, target);
loss.backward();
See the MNIST example for a full tutorial including a training loop and dataset abstractions.
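For context, a minimal training loop might look like the sketch below. It uses Flashlight's SGDOptimizer; trainBatches is a hypothetical container of input/target Variable pairs standing in for the dataset abstractions covered in the MNIST example.

// Minimal training loop sketch. `trainBatches` is hypothetical; see the
// MNIST example for Flashlight's real dataset abstractions.
std::vector<std::pair<Variable, Variable>> trainBatches; // fill with {input, target} pairs
fl::SGDOptimizer optimizer(model.params(), 0.01 /* learning rate */);
for (int epoch = 0; epoch < 10; ++epoch) {
  for (auto& [input, target] : trainBatches) {
    optimizer.zeroGrad();
    auto output = model.forward(input);
    auto loss = categoricalCrossEntropy(output, target);
    loss.backward();
    optimizer.step();
  }
}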
Variable is a tape-based abstraction that wraps Flashlight tensors. Automatic differentiation in Flashlight is simple and works as you'd expect.
auto A = Variable(fl::rand({1000, 1000}), true /* calcGrad */);
auto B = 2.0 * A;
auto C = 1.0 + B;
auto D = log(C);
D.backward(); // populates A.grad() along with gradients for B, C, and D.
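After calling backward(), each Variable's gradient can be read back; continuing the snippet above:

auto gradA = A.grad();            // Variable holding dD/dA
auto gradTensor = gradA.tensor(); // the underlying Flashlight tensor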
Install with vcpkg | With Docker | From Source | From Source with vcpkg | Build Your Project with Flashlight
At minimum, compilation requires a C++ compiler with C++17 support, CMake >= 3.10, and make.
See the full dependency list for more details if building from source.
Instructions for building/installing Python bindings can be found here.
Flashlight can be broken down into several components as described above. Each component can be incrementally built by specifying the correct build options.
Flashlight can be built in one of two ways: with vcpkg, a C++ package manager, or from source.
Flashlight is most easily built and installed with vcpkg. Both the CUDA and CPU backends are supported with vcpkg. For either backend, first install Intel MKL. For the CUDA backend, install CUDA >= 9.2, cuDNN, and NCCL. Then, after installing vcpkg, install the libraries and core with:
./vcpkg/vcpkg install flashlight-cuda # CUDA backend, OR
./vcpkg/vcpkg install flashlight-cpu # CPU backend
To install Flashlight apps, check the features available for installation by running ./vcpkg search flashlight-cuda or ./vcpkg search flashlight-cpu. Each app is a “feature”: for example, ./vcpkg install flashlight-cuda[asr] installs the ASR app with the CUDA backend.
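Multiple features can be combined in a single install if desired; the feature combination below is purely illustrative:

./vcpkg/vcpkg install "flashlight-cuda[asr,lm]" # ASR and LM apps with the CUDA backend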
Below is the currently-supported list of features (for each of flashlight-cuda and flashlight-cpu):
flashlight-{cuda/cpu}[lib] # Flashlight libraries
flashlight-{cuda/cpu}[nn] # Flashlight neural net library
flashlight-{cuda/cpu}[asr] # Flashlight speech recognition app
flashlight-{cuda/cpu}[lm] # Flashlight language modeling app
flashlight-{cuda/cpu}[imgclass] # Flashlight image classification app
Flashlight app binaries are also built for the selected features and are installed into the vcpkg install tree's tools directory.
Integrating Flashlight into your own project is simple using vcpkg's CMake toolchain integration.
To build Flashlight from source with vcpkg, first install the dependencies for your backend of choice using vcpkg:
To build the Flashlight CUDA backend from source using dependencies installed with vcpkg, install CUDA >= 9.2, cuDNN, NCCL, and Intel MKL, then build the rest of the dependencies for the CUDA backend based on which Flashlight features you'd like to build:
./vcpkg install \
cuda intel-mkl fftw3 cub kenlm \ # if building flashlight libraries
arrayfire[cuda] cudnn nccl openmpi cereal stb \ # if building the flashlight neural net library
gflags glog \ # if building any flashlight apps
libsndfile \ # if building the flashlight asr app
gtest # optional, if building tests
To build the Flashlight CPU backend from source using dependencies installed with vcpkg, install Intel MKL, then build the rest of the dependencies for the CPU backend based on which Flashlight features you'd like to build:
./vcpkg install \
intel-mkl fftw3 kenlm \ # for flashlight libraries
arrayfire[cpu] gloo[mpi] openmpi onednn cereal stb \ # for the flashlight neural net library
gflags glog \ # for the flashlight runtime pkg (any flashlight apps using it)
libsndfile \ # for the flashlight speech pkg
gtest # optional, for tests
To build Flashlight from source with these dependencies using the vcpkg toolchain file, clone the repository:
git clone https://github.com/flashlight/flashlight.git && cd flashlight
mkdir -p build && cd build
Then, build from source using vcpkg's CMake toolchain:
cmake .. \
-DCMAKE_BUILD_TYPE=Release \
-DFL_BUILD_ARRAYFIRE=ON \
-DCMAKE_TOOLCHAIN_FILE=[path to your vcpkg clone]/scripts/buildsystems/vcpkg.cmake
make -j$(nproc)
make install -j$(nproc) # only if you want to install Flashlight for external use
To build a subset of Flashlight’s features, see the build options below.
To build from source, first install the below dependencies. Most are available with your system’s local package manager.
Some dependencies marked below are downloaded and installed automatically if not found on the local system. FL_BUILD_STANDALONE
determines this behavior — if disabled, dependencies won’t be downloaded and built when building Flashlight.
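For example, to require every dependency to already be present on the system rather than fetched at configure time, FL_BUILD_STANDALONE can be turned off (an illustrative invocation):

cmake .. -DCMAKE_BUILD_TYPE=Release -DFL_BUILD_ARRAYFIRE=ON -DFL_BUILD_STANDALONE=OFF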
Once all dependencies are installed, clone the repository:
git clone https://github.com/flashlight/flashlight.git && cd flashlight
mkdir -p build && cd build
Then build all Flashlight components with:
cmake .. -DCMAKE_BUILD_TYPE=Release -DFL_BUILD_ARRAYFIRE=ON [...build options]
make -j$(nproc)
make install
Setting the MKLROOT environment variable (export MKLROOT=/opt/intel/oneapi/mkl/latest or export MKLROOT=/opt/intel/mkl on most Linux-based systems) can help CMake find Intel MKL if not initially found.
To build a smaller subset of Flashlight features/apps, see the build options below for a complete list of options.
To install Flashlight in a custom directory, use CMake's CMAKE_INSTALL_PREFIX argument. Flashlight libraries can be built as shared libraries using CMake's BUILD_SHARED_LIBS argument.
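For instance, a custom install prefix and shared libraries can both be requested at configure time (the prefix below is just a placeholder):

cmake .. \
  -DCMAKE_BUILD_TYPE=Release \
  -DCMAKE_INSTALL_PREFIX=$HOME/opt/flashlight \
  -DBUILD_SHARED_LIBS=ON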
Flashlight uses modern CMake and IMPORTED targets for most dependencies. If a dependency isn't found, passing -D<package>_DIR to your cmake command, or exporting <package>_DIR as an environment variable equal to the path to <package>Config.cmake, can help locate dependencies on your system. See the documentation for more details. If CMake is failing to locate a package, check to see if a corresponding issue has already been created before creating your own.
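As a concrete sketch, if ArrayFire were installed under /opt/arrayfire (an assumed path), its CMake config could be pointed to explicitly in either of these ways:

cmake .. -DArrayFire_DIR=/opt/arrayfire/share/ArrayFire/cmake
# or, equivalently, via an environment variable:
export ArrayFire_DIR=/opt/arrayfire/share/ArrayFire/cmake
cmake ..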
On macOS, ArrayFire can be installed with Homebrew, and the Flashlight core can be built as follows:
brew install arrayfire
cmake .. \
-DFL_ARRAYFIRE_USE_OPENCL=ON \
-DFL_USE_ONEDNN=OFF \
-DFL_BUILD_TESTS=OFF \
-DFL_BUILD_EXAMPLES=OFF \
-DFL_BUILD_SCRIPTS=OFF \
-DFL_BUILD_DISTRIBUTED=OFF
make -j$(nproc)
Dependencies marked with * are automatically downloaded and built from source if not found on the system. Setting FL_BUILD_STANDALONE to OFF disables this behavior.
Dependencies marked with ^ are required if building with distributed training enabled (FL_BUILD_DISTRIBUTED — see the build options below). Distributed training is required for all apps.
Dependencies marked with † are installable via vcpkg. See the instructions above for installing those dependencies when doing a from-source build of Flashlight.
Component | Backend | Dependencies |
---|---|---|
libraries | CUDA | CUDA >= 9.2, CUB*† (if CUDA < 11) |
libraries | CPU | A BLAS library (Intel MKL >= 2018, OpenBLAS†, etc.) |
core | Any | ArrayFire >= 3.7.3†, an MPI library^ (OpenMPI†, etc.), cereal*† >= 1.3.0, stb*† |
core | CUDA | CUDA >= 9.2, NCCL^, cuDNN |
core | CPU | oneDNN† >= 2.5.2, gloo (with MPI)*^† |
app: all | Any | Google Glog†, Gflags† |
app: asr | Any | libsndfile*† >= 10.0.28, a BLAS library (Intel MKL >= 2018, OpenBLAS†, etc.), flashlight/text* |
app: imgclass | Any | - |
app: lm | Any | flashlight/text* |
tests | Any | Google Test (gtest, with gmock)*† >= 1.10.0 |
The Flashlight CMake build accepts the following build options (prefixed with -D when running CMake from the command line):
Name | Options | Default Value | Description |
---|---|---|---|
FL_BUILD_ARRAYFIRE | ON, OFF | ON | Build Flashlight with the ArrayFire backend. |
FL_BUILD_STANDALONE | ON, OFF | ON | Downloads/builds some dependencies if not found. |
FL_BUILD_LIBRARIES | ON, OFF | ON | Build the Flashlight libraries. |
FL_BUILD_CORE | ON, OFF | ON | Build the Flashlight neural net library. |
FL_BUILD_DISTRIBUTED | ON, OFF | ON | Build with distributed training; required for apps. |
FL_BUILD_CONTRIB | ON, OFF | ON | Build contrib APIs subject to breaking changes. |
FL_BUILD_APPS | ON, OFF | ON | Build applications (see below). |
FL_BUILD_APP_ASR | ON, OFF | ON | Build the automatic speech recognition application. |
FL_BUILD_APP_IMGCLASS | ON, OFF | ON | Build the image classification application. |
FL_BUILD_APP_LM | ON, OFF | ON | Build the language modeling application. |
FL_BUILD_APP_ASR_TOOLS | ON, OFF | ON | Build automatic speech recognition app tools. |
FL_BUILD_TESTS | ON, OFF | ON | Build tests. |
FL_BUILD_EXAMPLES | ON, OFF | ON | Build examples. |
FL_BUILD_EXPERIMENTAL | ON, OFF | OFF | Build experimental components. |
CMAKE_BUILD_TYPE | See CMake docs. | Debug | See the CMake documentation. |
CMAKE_INSTALL_PREFIX | [Directory] | See CMake docs. | See the CMake documentation. |
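As an example of combining these options, the following illustrative configuration builds the libraries, core, and ASR app while skipping the other apps, tests, and examples:

cmake .. \
  -DCMAKE_BUILD_TYPE=Release \
  -DFL_BUILD_ARRAYFIRE=ON \
  -DFL_BUILD_APP_IMGCLASS=OFF \
  -DFL_BUILD_APP_LM=OFF \
  -DFL_BUILD_TESTS=OFF \
  -DFL_BUILD_EXAMPLES=OFF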
Flashlight is most easily linked to using CMake. Flashlight exports the following CMake targets when installed:

- flashlight::flashlight — contains the Flashlight libraries as well as the Flashlight core autograd and neural network library.
- flashlight::fl_pkg_runtime — contains the Flashlight core as well as common utilities for training (logging / flags / distributed utils).
- flashlight::fl_pkg_vision — contains the Flashlight core as well as common utilities for vision pipelines.
- flashlight::fl_pkg_text — contains the Flashlight core as well as common utilities for dealing with text data.
- flashlight::fl_pkg_speech — contains the Flashlight core as well as common utilities for dealing with speech data.
- flashlight::fl_pkg_halide — contains the Flashlight core and extensions to easily interface with Halide.

Given a simple project.cpp file that includes and links to Flashlight:
#include <iostream>
#include <flashlight/fl/flashlight.h>
int main() {
fl::init();
fl::Variable v(fl::full({1}, 1.), true);
auto result = v + 10;
std::cout << "Tensor value is " << result.tensor() << std::endl; // 11.000
return 0;
}
The following CMake configuration links Flashlight and sets include directories:
cmake_minimum_required(VERSION 3.10)
set(CMAKE_CXX_STANDARD 17)
set(CMAKE_CXX_STANDARD_REQUIRED ON)
add_executable(myProject project.cpp)
find_package(flashlight CONFIG REQUIRED)
target_link_libraries(myProject PRIVATE flashlight::flashlight)
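If your project instead uses one of the domain packages, the corresponding exported target can be linked in place of the base library; a minimal sketch assuming the speech package was built and installed:

find_package(flashlight CONFIG REQUIRED)
# Link the speech package target, which pulls in the Flashlight core transitively.
target_link_libraries(myProject PRIVATE flashlight::fl_pkg_speech)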
If you installed Flashlight with vcpkg, the above CMake configuration for myProject can be built by running:
cd project && mkdir build && cd build
cmake .. \
-DCMAKE_TOOLCHAIN_FILE=[path to vcpkg clone]/scripts/buildsystems/vcpkg.cmake \
-DCMAKE_BUILD_TYPE=Release
make -j$(nproc)
If using a from-source installation of Flashlight, CMake will find Flashlight automatically:
cd project && mkdir build && cd build
cmake .. -DCMAKE_BUILD_TYPE=Release
make -j$(nproc)
If Flashlight is installed in a custom location using a CMAKE_INSTALL_PREFIX, passing -Dflashlight_DIR=[install prefix]/share/flashlight/cmake as an argument to your cmake command can help CMake find Flashlight.
Flashlight and its dependencies can also be built with the provided Dockerfiles; see the accompanying Docker documentation for more information.
Contact: [email protected], [email protected], [email protected], [email protected], [email protected], [email protected],
[email protected], [email protected], [email protected], [email protected]
Flashlight is being very actively developed. See
CONTRIBUTING for more on how to help out.
Some of Flashlight’s code is derived from
arrayfire-ml.
You can cite Flashlight using:
@misc{kahn2022flashlight,
title={Flashlight: Enabling Innovation in Tools for Machine Learning},
author={Jacob Kahn and Vineel Pratap and Tatiana Likhomanenko and Qiantong Xu and Awni Hannun and Jeff Cai and Paden Tomasello and Ann Lee and Edouard Grave and Gilad Avidov and Benoit Steiner and Vitaliy Liptchinsky and Gabriel Synnaeve and Ronan Collobert},
year={2022},
eprint={2201.12465},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
Flashlight is under an MIT license. See LICENSE for more information.