The Neural Chunker uses a neural network model to identify optimal chunk boundaries based on learned patterns of semantic coherence.
Examples
Text Input
```python
from chonkie.cloud import NeuralChunker

chunker = NeuralChunker(
    model="mirth/chonky_modernbert_large_1"
)

text = "Your text here..."
chunks = chunker.chunk(text)
```
File Input
```python
from chonkie.cloud import NeuralChunker

chunker = NeuralChunker(
    model="mirth/chonky_modernbert_large_1"
)

# Chunk from file
with open("document.txt", "rb") as f:
    chunks = chunker.chunk(file=f)
```
Request
Parameters
text
string | string[]
The text to chunk. Can be a single string or an array of strings for batch processing. Either text or file is required.
file
file
The file to chunk. Use multipart/form-data encoding. Either text or file is required.
model
string
default:"mirth/chonky_modernbert_large_1"
The neural chunking model to use.
min_characters_per_chunk
number
Minimum number of characters per chunk.
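For callers not using the Python client, these parameters map onto an HTTP POST request. The sketch below is illustrative only: the endpoint path and the Authorization header scheme are assumptions, not taken from this page, so check the API reference for the exact values.

```python
import requests

# Endpoint path and auth scheme are assumptions -- verify against the API reference.
API_URL = "https://api.chonkie.ai/v1/chunk/neural"
API_KEY = "your-api-key"

payload = {
    "text": "Your text here...",                 # or an array of strings for batch processing
    "model": "mirth/chonky_modernbert_large_1",  # default model
}

response = requests.post(
    API_URL,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
)
response.raise_for_status()
chunks = response.json()  # array of Chunk objects (see Response below)
```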
Response
Returns
Array of Chunk objects, each containing:
start_index
number
Starting character position in the original text.
end_index
number
Ending character position in the original text.
token_count
number
Number of tokens in the chunk.
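As a quick sketch of how these fields are consumed, the loop below prints each chunk's character span and token count. It assumes the raw JSON form from the HTTP example above (a list of dicts); the Python client may instead return Chunk objects exposing the same fields as attributes.

```python
# Print the span and size of each returned chunk.
for chunk in chunks:
    start, end = chunk["start_index"], chunk["end_index"]
    print(f"chars {start}-{end}: {chunk['token_count']} tokens")
```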