# Docling

Docling simplifies document processing, parsing diverse formats (including advanced PDF understanding) and providing seamless integrations with the gen AI ecosystem.

## Features

* Parsing of [multiple document formats][supported_formats] incl. PDF, DOCX, PPTX, XLSX, HTML, WAV, MP3, VTT, images (PNG, TIFF, JPEG, ...), and more
* Advanced PDF understanding incl. page layout, reading order, table structure, code, formulas, image classification, and more
* Unified, expressive [DoclingDocument][docling_document] representation format
* Various [export formats][supported_formats] and options, including Markdown, HTML, [DocTags](https://arxiv.org/abs/2503.11576) and lossless JSON
* Local execution capabilities for sensitive data and air-gapped environments
* Plug-and-play [integrations][integrations] incl.
LangChain, LlamaIndex, Crew AI & Haystack for agentic AI
* Extensive OCR support for scanned PDFs and images
* Support of several Visual Language Models ([GraniteDocling](https://huggingface.co/ibm-granite/granite-docling-258M))
* Audio support with Automatic Speech Recognition (ASR) models
* Connect to any agent using the [MCP server](https://docling-project.github.io/docling/usage/mcp/)
* Simple and convenient CLI

### What's new

* Structured [information extraction][extraction] \[beta\]
* New layout model (**Heron**) by default, for faster PDF parsing
* [MCP server](https://docling-project.github.io/docling/usage/mcp/) for agentic applications
* Parsing of Web Video Text Tracks (WebVTT) files

### Coming soon

* Metadata extraction, including title, authors, references & language
* Chart understanding (Barchart, Piechart, LinePlot, etc.)
* Complex chemistry understanding (molecular structures)

## Installation

To use Docling, simply install `docling` from your package manager, e.g. pip:

```bash
pip install docling
```

Works on macOS, Linux and Windows environments, on both x86_64 and arm64 architectures. More [detailed installation instructions](https://docling-project.github.io/docling/installation/) are available in the docs.

## Getting started

To convert individual documents with Python, use `convert()`, for example:

```python
from docling.document_converter import DocumentConverter

source = "https://arxiv.org/pdf/2408.09869"  # document per local path or URL
converter = DocumentConverter()
result = converter.convert(source)
print(result.document.export_to_markdown())  # output: "## Docling Technical Report[...]"
```

More [advanced usage options](https://docling-project.github.io/docling/usage/advanced_options/) are available in the docs.

## CLI

Docling has a built-in CLI to run conversions.
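Alongside the URL example below, the CLI also accepts local files, and output format and location can be set explicitly. A minimal sketch, assuming the `--to` and `--output` options described in the CLI docs (`report.pdf` is a placeholder file name):

```bash
# Convert a local PDF to Markdown (the default) in the current directory:
docling report.pdf

# Choose an export format and an output directory explicitly:
docling --to json --output ./converted report.pdf
```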
```bash
docling https://arxiv.org/pdf/2206.01062
```

You can also use [GraniteDocling](https://huggingface.co/ibm-granite/granite-docling-258M) and other VLMs via the Docling CLI:

```bash
docling --pipeline vlm --vlm-model granite_docling https://arxiv.org/pdf/2206.01062
```

This will use MLX acceleration on supported Apple Silicon hardware. Read more [in the docs](https://docling-project.github.io/docling/usage/).

## Documentation

Check out Docling's [documentation](https://docling-project.github.io/docling/) for details on installation, usage, concepts, recipes, extensions, and more.

## Examples

Go hands-on with our [examples](https://docling-project.github.io/docling/examples/), demonstrating how to address different application use cases with Docling.

## Integrations

To further accelerate your AI application development, check out Docling's native [integrations](https://docling-project.github.io/docling/integrations/) with popular frameworks and tools.

## Get help and support

Please feel free to connect with us using the [discussion section](https://github.com/docling-project/docling/discussions).

## Technical report

For more details on Docling's inner workings, check out the [Docling Technical Report](https://arxiv.org/abs/2408.09869).

## Contributing

Please read [Contributing to Docling](https://github.com/docling-project/docling/blob/main/CONTRIBUTING.md) for details.

## References

If you use Docling in your projects, please consider citing the following:

```bib
@techreport{Docling,
  author = {Deep Search Team},
  month = {8},
  title = {Docling Technical Report},
  url = {https://arxiv.org/abs/2408.09869},
  eprint = {2408.09869},
  doi = {10.48550/arXiv.2408.09869},
  version = {1.0.0},
  year = {2024}
}
```

## License

The Docling codebase is under MIT license. For individual model usage, please refer to the model licenses found in the original packages.

## LF AI & Data

Docling is hosted as a project in the [LF AI & Data Foundation](https://lfaidata.foundation/projects/).
### IBM ❤️ Open Source AI

The project was started by the AI for knowledge team at IBM Research Zurich.

[supported_formats]: https://docling-project.github.io/docling/usage/supported_formats/
[docling_document]: https://docling-project.github.io/docling/concepts/docling_document/
[integrations]: https://docling-project.github.io/docling/integrations/
[extraction]: https://docling-project.github.io/docling/examples/extraction/