# nlp-training-and-inference-openvino

**Repository Path**: mirrors_intel/nlp-training-and-inference-openvino

## Basic Information

- **Project Name**: nlp-training-and-inference-openvino
- **Description**: No description available
- **Primary Language**: Unknown
- **License**: Apache-2.0
- **Default Branch**: main
- **Homepage**: None
- **GVP Project**: No

## Statistics

- **Stars**: 0
- **Forks**: 0
- **Created**: 2022-10-24
- **Last Updated**: 2025-09-27

## Categories & Tags

**Categories**: Uncategorized
**Tags**: None

## README

# Overview

This repository contains NLP use cases built using Intel's AI components, with a focus on the OpenVINO™ toolkit. Each use case comes with detailed documentation in its respective folder.

| Use Case Name | Description | Folder Name |
|:--------|:-------------|:-----|
| Quantization Aware Training and Inference using OpenVINO™ toolkit | An end-to-end NLP workflow with Quantization Aware Training using Optimum-Intel\*, and inference using Optimum-Intel\*, OpenVINO™ Model Server, and Optimum-ONNX Runtime with OpenVINO™ Execution Provider | [question-answering-bert-qat](https://github.com/intel/nlp-training-and-inference-openvino/tree/main/question-answering-bert-qat) |