# oneAPI Math Kernel Library (oneMKL) Interfaces
The oneMKL interfaces project is an open-source implementation of the oneMKL Data Parallel C++ (DPC++) interface, as defined by the [oneMKL specification](https://spec.oneapi.com/versions/latest/elements/oneMKL/source/index.html). It works with multiple devices (backends) using device-specific libraries underneath.
## Table of Contents
- [Support and Requirements](#support-and-requirements)
- [Build Setup](#build-setup)
- [Building with Conan](#building-with-conan)
- [Building with CMake](#building-with-cmake)
- [Project Cleanup](#project-cleanup)
- [FAQs](#faqs)
- [Legal Information](#legal-information)
---
## Support and Requirements
### Supported Usage Models:
There are two oneMKL selector layer implementations:
- **Run-time dispatching**: The application is linked with the oneMKL library, and the required backend is loaded at run time based on the device vendor (all libraries should be dynamic). A complete, runnable sketch follows this list.
Example of app.cpp with run-time dispatching:
```cpp
#include "oneapi/mkl.hpp"
...
sycl::device cpu_dev = sycl::device(sycl::cpu_selector());
sycl::device gpu_dev = sycl::device(sycl::gpu_selector());
sycl::queue cpu_queue(cpu_dev);
sycl::queue gpu_queue(gpu_dev);
oneapi::mkl::blas::column_major::gemm(cpu_queue, transA, transB, m, ...);
oneapi::mkl::blas::column_major::gemm(gpu_queue, transA, transB, m, ...);
```
How to build an application with run-time dispatching:
```cmd
$> clang++ -fsycl -c -I$ONEMKL/include app.cpp
$> clang++ -fsycl app.o -L$ONEMKL/lib -lonemkl
```
- **Compile-time dispatching**: The application uses a templated backend selector API in which the template parameters specify the required backends and third-party libraries. The application is then linked with the required oneMKL backend wrapper libraries (libraries can be static or dynamic).
Example of app.cpp with compile-time dispatching:
```cpp
#include "oneapi/mkl.hpp"
...
sycl::device cpu_dev = sycl::device(sycl::cpu_selector());
sycl::device gpu_dev = sycl::device(sycl::gpu_selector());
sycl::queue cpu_queue(cpu_dev);
sycl::queue gpu_queue(gpu_dev);
oneapi::mkl::backend_selector<oneapi::mkl::backend::mklcpu> cpu_selector(cpu_queue);
oneapi::mkl::blas::column_major::gemm(cpu_selector, transA, transB, m, ...);
oneapi::mkl::blas::column_major::gemm(oneapi::mkl::backend_selector<oneapi::mkl::backend::cublas> {gpu_queue}, transA, transB, m, ...);
```
How to build an application with compile-time dispatching:
```cmd
$> clang++ -fsycl -c -I$ONEMKL/include app.cpp
$> clang++ -fsycl app.o -L$ONEMKL/lib -lonemkl_blas_mklcpu -lonemkl_blas_cublas
```
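For reference, below is a minimal, self-contained sketch of the run-time dispatching model: a small column-major SGEMM run on whichever CPU device SYCL selects. It is an illustration rather than an official sample: the matrix data, sizes, and file name are made up, and it assumes a DPC++ compiler plus linking against the run-time dispatching library (`-lonemkl`) with the mklcpu backend available. Switching it to compile-time dispatching amounts to replacing the queue argument with `oneapi::mkl::backend_selector<oneapi::mkl::backend::mklcpu>{cpu_queue}` and linking the corresponding backend wrapper library instead.
```cpp
// rt_gemm.cpp (hypothetical): complete run-time dispatching sketch.
#include <cstdint>
#include <vector>

#include <CL/sycl.hpp>
#include "oneapi/mkl.hpp"

int main() {
    const std::int64_t m = 2, n = 2, k = 2;
    std::vector<float> A = {1.0f, 3.0f, 2.0f, 4.0f};  // m x k, column major
    std::vector<float> B = {5.0f, 7.0f, 6.0f, 8.0f};  // k x n, column major
    std::vector<float> C(m * n, 0.0f);                // m x n, column major

    sycl::device cpu_dev = sycl::device(sycl::cpu_selector());
    sycl::queue cpu_queue(cpu_dev);
    {
        sycl::buffer<float, 1> a_buf(A.data(), sycl::range<1>(A.size()));
        sycl::buffer<float, 1> b_buf(B.data(), sycl::range<1>(B.size()));
        sycl::buffer<float, 1> c_buf(C.data(), sycl::range<1>(C.size()));

        // C = 1.0 * A * B + 0.0 * C; the backend matching the queue's device
        // (here the CPU backend) is loaded and dispatched to at run time.
        oneapi::mkl::blas::column_major::gemm(
            cpu_queue,
            oneapi::mkl::transpose::nontrans, oneapi::mkl::transpose::nontrans,
            m, n, k, 1.0f, a_buf, m, b_buf, k, 0.0f, c_buf, m);
    }  // buffers are destroyed here, copying the result back into C
    return 0;
}
```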
### Supported Configurations:
Supported domains: BLAS, LAPACK, RNG
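The selector layer looks the same across domains. As a hedged illustration (not an official sample), the sketch below calls the RNG domain through the same run-time dispatching pattern, generating uniformly distributed floats on a CPU queue with the buffer-based `generate` API from the oneMKL specification; it assumes the mklcpu RNG backend is built and linked, and the file name and seed are made up.
```cpp
// rng_uniform.cpp (hypothetical): RNG domain through the same selector layer.
#include <cstdint>
#include <vector>

#include <CL/sycl.hpp>
#include "oneapi/mkl.hpp"

int main() {
    const std::int64_t n = 1000;
    std::vector<float> r(n);

    sycl::device cpu_dev = sycl::device(sycl::cpu_selector());
    sycl::queue cpu_queue(cpu_dev);
    {
        sycl::buffer<float, 1> r_buf(r.data(), sycl::range<1>(r.size()));

        // Philox-based engine seeded with an arbitrary value; uniform floats in [0, 1).
        oneapi::mkl::rng::philox4x32x10 engine(cpu_queue, 777);
        oneapi::mkl::rng::uniform<float> distribution(0.0f, 1.0f);
        oneapi::mkl::rng::generate(distribution, engine, n, r_buf);
    }  // buffer destruction copies the generated numbers back into r
    return 0;
}
```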
#### Linux*
Domain | Backend | Library | Supported Link Type
:--- | :--- | :--- | :---
BLAS | x86 CPU | Intel(R) oneAPI Math Kernel Library | Dynamic, Static
BLAS | Intel GPU | Intel(R) oneAPI Math Kernel Library | Dynamic, Static
BLAS | NVIDIA GPU | NVIDIA cuBLAS | Dynamic, Static
BLAS | x86 CPU | NETLIB LAPACK | Dynamic, Static
LAPACK | x86 CPU | Intel(R) oneAPI Math Kernel Library | Dynamic, Static
LAPACK | Intel GPU | Intel(R) oneAPI Math Kernel Library | Dynamic, Static
RNG | x86 CPU | Intel(R) oneAPI Math Kernel Library | Dynamic, Static
RNG | Intel GPU | Intel(R) oneAPI Math Kernel Library | Dynamic, Static
RNG | NVIDIA GPU | NVIDIA cuRAND | Dynamic, Static
#### Windows*
Domain | Backend | Library | Supported Link Type
:--- | :--- | :--- | :---
BLAS | x86 CPU | Intel(R) oneAPI Math Kernel Library | Dynamic, Static
BLAS | Intel GPU | Intel(R) oneAPI Math Kernel Library | Dynamic, Static
BLAS | x86 CPU | NETLIB LAPACK | Dynamic, Static
LAPACK | x86 CPU | Intel(R) oneAPI Math Kernel Library | Dynamic, Static
LAPACK | Intel GPU | Intel(R) oneAPI Math Kernel Library | Dynamic, Static
RNG | x86 CPU | Intel(R) oneAPI Math Kernel Library | Dynamic, Static
---
### Hardware Platform Support
- CPU
- Intel Atom(R) Processors
- Intel(R) Core(TM) Processor Family
- Intel(R) Xeon(R) Processor Family
- Accelerators
- Intel(R) Processor Graphics GEN9
- NVIDIA(R) TITAN RTX(TM) (Linux* only. cuRAND backend tested also with Quadro and A100 GPUs. Not tested with other NVIDIA GPU families and products.)
---
### Supported Operating Systems
#### Linux*
Operating System | CPU Host/Target | Integrated Graphics from Intel (Intel GPU) | NVIDIA GPU
:--- | :--- | :--- | :---
Ubuntu | 18.04.3, 19.04 | 18.04.3, 19.10 | 18.04.3, 20.04
SUSE Linux Enterprise Server* | 15 | *Not supported* | *Not supported*
Red Hat Enterprise Linux* (RHEL*) | 8 | *Not supported* | *Not supported*
Linux* kernel | *N/A* | 4.11 or higher | *N/A*
#### Windows*
Operating System | CPU Host/Target | Integrated Graphics from Intel (Intel GPU)
:--- | :--- | :---
Microsoft Windows* | 10 (64-bit version only) | 10 (64-bit version only)
Microsoft Windows* Server | 2016, 2019 | *Not supported*
---
### Software Requirements
**What should I download?**
#### General:
All build methods (Conan or CMake) require:
- C++ compiler: Linux*: GNU* GCC 5.1 or higher; Windows*: MSVS* 2017 or MSVS* 2019 (version 16.5 or newer)
- Python 3.6 or higher
- CMake
- Ninja (optional)

Additional requirements by build method and target:

Using Conan | Using CMake Directly: Functional Testing | Using CMake Directly: Build Only | Using CMake Directly: Documentation
:--- | :--- | :--- | :---
Conan C++ package manager | GNU* FORTRAN Compiler | - | Sphinx
\- | NETLIB LAPACK | - | -
#### Hardware and OS Specific:
Operating System | Device | Package | Installed by Conan
:--- | :--- | :--- | :---
Linux*/Windows* | x86 CPU | Intel(R) oneAPI DPC++ Compiler or Intel project for LLVM* technology | No
Linux*/Windows* | x86 CPU | Intel(R) oneAPI Math Kernel Library | Yes
Linux*/Windows* | Intel GPU | Intel(R) oneAPI DPC++ Compiler | No
Linux*/Windows* | Intel GPU | Intel GPU driver | No
Linux*/Windows* | Intel GPU | Intel(R) oneAPI Math Kernel Library | Yes
Linux* only | NVIDIA GPU | Intel project for LLVM* technology | No
*If [Building with Conan](#building-with-conan), the packages above marked "No" must be installed manually.*
*If [Building with CMake](#building-with-cmake), all of the above packages must be installed manually.*
#### Notice for Use of Conan Package Manager
**LEGAL NOTICE: By downloading and using this container or script as applicable (the “Software Package”) and the included software or software made available for download, you agree to the terms and conditions of the software license agreements for the Software Package, which may also include notices, disclaimers, or license terms for third party software (together, the “Agreements”) included in this README file.**
**If the Software Package is installed through a silent install, your download and use of the
Software Package indicates your acceptance of the Agreements.**
#### Product and Version Information:
Product | Supported Version | Installed by Conan | Conan Package Source | Package Install Location on Linux* | License
:--- | :--- | :--- | :--- | :--- | :---
Python | 3.6 or higher | No | *N/A* | *Pre-installed or Installed by user* | [PSF](https://docs.python.org/3.6/license.html)
[Conan C++ Package Manager](https://conan.io/downloads.html) | 1.24 or higher | No | *N/A* | *Installed by user* | [MIT](https://github.com/conan-io/conan/blob/develop/LICENSE.md)
[CMake](https://cmake.org/download/) | 3.13 or higher | Yes (3.15 or higher) | conan-center | ~/.conan/data or $CONAN_USER_HOME/.conan/data | [The OSI-approved BSD 3-clause License](https://gitlab.kitware.com/cmake/cmake/raw/master/Copyright.txt)
[Ninja](https://ninja-build.org/) | 1.10.0 | Yes | conan-center | ~/.conan/data or $CONAN_USER_HOME/.conan/data | [Apache License v2.0](https://github.com/ninja-build/ninja/blob/master/COPYING)
[GNU* FORTRAN Compiler](https://gcc.gnu.org/wiki/GFortran) | 7.4.0 or higher | Yes | apt | /usr/bin | [GNU General Public License, version 3](https://gcc.gnu.org/onlinedocs/gcc-7.5.0/gfortran/Copying.html)
[Intel(R) oneAPI DPC++ Compiler](https://software.intel.com/en-us/oneapi/dpc-compiler) | latest | No | *N/A* | *Installed by user* | [End User License Agreement for the Intel(R) Software Development Products](https://software.intel.com/en-us/license/eula-for-intel-software-development-products)
[Intel project for LLVM* technology binary for x86 CPU](https://github.com/intel/llvm/releases) | Daily builds (experimental) tested with [20200331](https://github.com/intel/llvm/releases/download/20200331/dpcpp-compiler.tar.gz) | No | *N/A* | *Installed by user* | [Apache License v2](https://github.com/intel/llvm/blob/sycl/sycl/LICENSE.TXT)
[Intel project for LLVM* technology source for NVIDIA GPU](https://github.com/intel/llvm/releases) | Daily source releases: tested with [20200421](https://github.com/intel/llvm/tree/20200421) | No | *N/A* | *Installed by user* | [Apache License v2](https://github.com/intel/llvm/blob/sycl/sycl/LICENSE.TXT)
[Intel(R) oneAPI Math Kernel Library](https://software.intel.com/en-us/oneapi/onemkl) | latest | Yes | apt | /opt/intel/inteloneapi/mkl | [Intel Simplified Software License](https://software.intel.com/en-us/license/intel-simplified-software-license)
[NVIDIA CUDA SDK](https://developer.nvidia.com/cublas) | 10.2 | No | *N/A* | *Installed by user* |[End User License Agreement](https://docs.nvidia.com/cuda/eula/index.html)
[NETLIB LAPACK](https://www.netlib.org/) | 3.7.1 | Yes | conan-community | ~/.conan/data or $CONAN_USER_HOME/.conan/data | [BSD like license](http://www.netlib.org/lapack/LICENSE.txt)
[Sphinx](https://www.sphinx-doc.org/en/master/) | 2.4.4 | Yes | pip | ~/.local/bin (or similar user local directory) | [BSD License](https://github.com/sphinx-doc/sphinx/blob/3.x/LICENSE)
*conan-center: https://api.bintray.com/conan/conan/conan-center*
*conan-community: https://api.bintray.com/conan/conan-community/conan*
---
## Build Setup
1. Install Intel(R) oneAPI DPC++ Compiler (select variant as per requirement).
2. Clone this project to `<path to onemkl>`, where `<path to onemkl>` is the root directory of this repository.
3. You can [Build with Conan](#building-with-conan) to automate the process of getting dependencies or you can download and install the required dependencies manually and [Build with CMake](#building-with-cmake) directly.
*Note: The Conan package manager automates the process of getting the required packages, so that you do not have to go to different web locations and follow different instructions to install them.*
---
## Building with Conan
**This method currently works on Linux* only.**
**Make sure you have completed [Build Setup](#build-setup).**
*Note: To understand how dependencies are resolved, refer to the [Product and Version Information](#product-and-version-information) section. For details about Conan package manager, refer to [Conan Documentation](https://docs.conan.io/en/latest/).*
### Getting Conan
Conan can be [installed](https://docs.conan.io/en/latest/installation.html) from pip:
```bash
pip3 install conan
```
### Setting up Conan
#### Conan Default Directory
Conan stores all files and data in `~/.conan`. If you are fine with this behavior, you can skip to the [Conan Profiles](#conan-profiles) section.
To change this behavior, set the environment variable `CONAN_USER_HOME` to a path of your choice. A `.conan/` directory will be created in this path and future Conan commands will use this directory to find configuration files and download dependent packages. Packages will be downloaded into `$CONAN_USER_HOME/data`. To change the `"/data"` part of this directory, refer to the `[storage]` section of `conan.conf` file.
To make this setting persistent across terminal sessions, you can add the line below to your `~/.bashrc` or a custom run script. Refer to the [Conan Documentation](https://docs.conan.io/en/latest/reference/env_vars.html#conan-user-home) for more details.
```sh
export CONAN_USER_HOME=/usr/local/my_workspace/conan_cache
```
#### Conan Profiles
Profiles are a way for Conan to determine a basic environment to use for building a project. This project ships with profiles for:
- Intel(R) oneAPI DPC++ Compiler for x86 CPU and Intel GPU backend: `inteldpcpp_lnx`
1. Open the profile you wish to use from `<path to onemkl>/conan/profiles/` and set `COMPILER_PREFIX` to the path of the compiler's root folder, that is, the folder containing the `bin` and `lib` directories. For example, the Intel(R) oneAPI DPC++ Compiler root folder for a default installation on Linux is `/opt/intel/inteloneapi/compiler/<version>/linux`. You can also use a custom compiler installation path.
```ini
COMPILER_PREFIX=<path to compiler root folder>
```
2. You can customize the `[env]` section of the profile based on individual requirements.
3. Install configurations for this project:
```sh
# Inside <path to onemkl>
$ conan config install conan/
```
This command installs all contents of `<path to onemkl>/conan/`, most importantly the profiles, into the Conan default directory.
*Note: If you change the profile, you must re-run the above command before you can use the new profile.*
### Building
1. Out-of-source build
```bash
# Inside <path to onemkl>
mkdir build && cd build
```
2. If you choose to build backends with the Intel(R) oneAPI Math Kernel Library, install the GPG key as described at https://software.intel.com/en-us/articles/oneapi-repo-instructions#aptpkg
3. Install dependencies
```sh
conan install .. --profile <profile_name> --build missing [-o <option1>=<value1>] [-o <option2>=<value2>]
```
The `conan install` command downloads and installs all requirements for the oneMKL DPC++ Interfaces project, as defined in `<path to onemkl>/conanfile.py`, based on the options passed. It also creates the `conanbuildinfo.cmake` file, which contains information about all dependencies and their directories. This file is used in the top-level `CMakeLists.txt`.
`-pr | --profile <profile_name>`
Defines a profile for Conan to use for building the project.
`-b | --build <package_name>`
Tells Conan to build or re-build a specific package. If `missing` is passed as a value, all missing packages are built. This option is recommended when you build the project for the first time, because it caches required packages. You can skip this option for later use of this command.
4. Build Project
```sh
conan build .. [--configure] [--build] [--test] # Default is all
```
The `conan build` command executes the `build()` procedure from `/conanfile.py`. Since this project uses `CMake`, you can choose to `configure`, `build`, `test` individually or perform all steps by passing no optional arguments.
5. Optionally, you can also install the package, similar to `cmake --install . --prefix <install_dir>`.
```sh
conan package .. --build-folder . --install-folder <install_dir>
```
`-bf | --build-folder`
Tells Conan where to find the built project.
`-if | --install-folder`
Tells Conan where to install the package. It is similar to specifying `CMAKE_INSTALL_PREFIX`.
*Note: For a detailed list of commands and options, refer to the [Conan Command Reference](https://docs.conan.io/en/latest/reference/commands.html).*
### Conan Build Options
#### Backend-related Options
The following options can be passed to `conan install` when building the oneMKL library:
- `build_shared_libs=[True | False]`. Setting it to `True` enables the building of dynamic libraries. The default value is `True`.
- `target_domains=[<list of domains>]`. Setting it to `blas` or any other list of domains enables building of only those domains. If not defined, the default is all supported domains.
- `enable_mklcpu_backend=[True | False]`. Setting it to `True` enables the building of oneMKL mklcpu backend. The default value is `True`.
- `enable_mklgpu_backend=[True | False]`. Setting it to `True` enables the building of oneMKL mklgpu backend. The default value is `True`.
- `enable_mklcpu_thread_tbb=[True | False]`. Setting it to `True` enables oneMKL on CPU with TBB threading instead of sequential. The default value is `True`.
#### Testing-related Options
- `build_functional_tests=[True | False]`. Setting it to `True` enables the building of functional tests. The default value is `True`.
#### Documentation
- `build_doc=[True | False]`. Setting it to `True` enables building the rst files into HTML files for updated documentation. The default value is `False`.
*Note: For a mapping between Conan and CMake options, refer to [build options](#build-options) under the CMake section.*
### Example
#### Build oneMKL as a static library for the oneMKL CPU and GPU backends:
```sh
# Inside <path to onemkl>
mkdir build && cd build
conan install .. --build missing --profile inteldpcpp_lnx -o build_shared_libs=False
conan build ..
```
---
## Building with CMake
1. Make sure you have completed [Build Setup](#build-setup).
2. Build and install all required [dependencies](#software-requirements).
Then:
- On Linux*
```bash
# Inside <path to onemkl>
mkdir build && cd build
export CXX=<path_to_dpcpp_compiler>/bin/dpcpp;
cmake .. [-DMKL_ROOT=<mkl_install_prefix>] \ # required only if environment variable MKLROOT is not set
[-DREF_BLAS_ROOT=<reference_blas_install_prefix>] \ # required only for testing
[-DREF_LAPACK_ROOT=<reference_lapack_install_prefix>] # required only for testing
cmake --build .
ctest
cmake --install . --prefix <path_to_install_dir>
```
- On Windows*
```bash
# Inside <path to onemkl>
md build && cd build
cmake .. -G Ninja
[-DMKL_ROOT=<mkl_install_prefix>] \ # required only if environment variable MKLROOT is not set
[-DREF_BLAS_ROOT=<reference_blas_install_prefix>] \ # required only for testing
[-DREF_LAPACK_ROOT=<reference_lapack_install_prefix>] # required only for testing
ninja
ctest
cmake --install . --prefix <path_to_install_dir>
```
### Build Options
All options specified in the Conan section are also available to CMake. You can specify these options with `-D<option>=<value>`.
The following table provides a detailed mapping of options between Conan and CMake.
Conan Option | CMake Option | Supported Values | Default Value
:---------- | :----------- | :--------------- | :---
build_shared_libs | BUILD_SHARED_LIBS | True, False | True
enable_mklcpu_backend | ENABLE_MKLCPU_BACKEND | True, False | True
enable_mklgpu_backend | ENABLE_MKLGPU_BACKEND | True, False | True
*Not Supported* | ENABLE_CUBLAS_BACKEND | True, False | False
*Not Supported* | ENABLE_CURAND_BACKEND | True, False | False
*Not Supported* | ENABLE_NETLIB_BACKEND | True, False | False
enable_mklcpu_thread_tbb | ENABLE_MKLCPU_THREAD_TBB | True, False | True
build_functional_tests | BUILD_FUNCTIONAL_TESTS | True, False | True
build_doc | BUILD_DOC | True, False | False
target_domains (list) | TARGET_DOMAINS (list) | blas, rng | All domains
*Note: `build_functional_tests` and the related CMake option affect all domains at a global scope.*
---
## Project Cleanup
Most use cases involve building the project without needing to clean up the build directory. However, if you wish to clean up the build directory, delete the `build` folder and create a new one. If you wish to remove the build files but retain the build configuration, the following commands will help. They apply to both the `Conan` and `CMake` methods of building this project.
```sh
# If you use "GNU/Unix Makefiles" for building,
make clean
# If you use "Ninja" for building
ninja -t clean
```
---
## Contributing
See [CONTRIBUTING](CONTRIBUTING.md) for more information.
## License
Distributed under the Apache license 2.0. See [LICENSE](LICENSE) for more
information.
---
## FAQs
### oneMKL
1. What is the difference between the following oneMKL items?
- The [oneAPI Specification for oneMKL](https://spec.oneapi.com/versions/latest/index.html)
- The [oneAPI Math Kernel Library (oneMKL) Interfaces](https://github.com/oneapi-src/oneMKL) Project
- The [Intel(R) oneAPI Math Kernel Library (oneMKL)](https://software.intel.com/content/www/us/en/develop/tools/oneapi/components/onemkl.html) Product
Answer:
- The [oneAPI Specification for oneMKL](https://spec.oneapi.com/versions/latest/index.html) defines the DPC++ interfaces for performance math library functions. The oneMKL specification can evolve faster and more frequently than implementations of the specification.
- The [oneAPI Math Kernel Library (oneMKL) Interfaces](https://github.com/oneapi-src/oneMKL) Project is an open source implementation of the specification. The project goal is to demonstrate how the DPC++ interfaces documented in the oneMKL specification can be implemented for any math library and work for any target hardware. While the implementation provided here may not yet be the full implementation of the specification, the goal is to build it out over time. We encourage the community to contribute to this project and help to extend support to multiple hardware targets and other math libraries.
- The [Intel(R) oneAPI Math Kernel Library (oneMKL)](https://software.intel.com/content/www/us/en/develop/tools/oneapi/components/onemkl.html) product is the Intel product implementation of the specification (with DPC++ interfaces) as well as similar functionality with C and Fortran interfaces, and is provided as part of Intel® oneAPI Base Toolkit. It is highly optimized for Intel CPU and Intel GPU hardware.
### Conan
1. I am behind a proxy. How can Conan download dependencies from external network?
- `~/.conan/conan.conf` has a `[proxies]` section where you can add the list of proxies. For details refer to [Conan proxy settings](https://docs.conan.io/en/latest/reference/config_files/conan.conf.html#proxies).
2. I get an error while installing packages via APT through Conan.
```
dpkg: warning: failed to open configuration file '~/.dpkg.cfg' for reading: Permission denied
Setting up intel-oneapi-mkl-devel (2021.1-408.beta07) ...
E: Sub-process /usr/bin/dpkg returned an error code (1)
```
- Although your user session has permissions to install packages via `sudo apt`, it does not have permissions to update the Debian package configuration. This returns error code 1 and causes the `conan install` command to fail.
- The package is most likely installed correctly and can be verified by:
1. Running the `conan install` command again.
2. Checking `/opt/intel/inteloneapi` for `mkl` and/or `tbb` directories.
---
#### [Legal information](legal_information.md)