split_text_on_tokens
- langchain_text_splitters.base.split_text_on_tokens(*, text: str, tokenizer: Tokenizer) → List[str]
Split the incoming text and return chunks using the tokenizer.
- Parameters:
text (str)
tokenizer (Tokenizer)
- Return type:
List[str]
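
A minimal usage sketch. It assumes the Tokenizer dataclass in langchain_text_splitters.base exposes chunk_overlap, tokens_per_chunk, encode, and decode fields; the whitespace tokenizer below is a toy stand-in for a real token encoder such as tiktoken.

```python
from langchain_text_splitters.base import Tokenizer, split_text_on_tokens

# Toy whitespace "tokenizer": each distinct word maps to one integer id.
vocab: dict[str, int] = {}
inverse: dict[int, str] = {}

def encode(text: str) -> list[int]:
    ids = []
    for word in text.split():
        if word not in vocab:
            vocab[word] = len(vocab)
            inverse[vocab[word]] = word
        ids.append(vocab[word])
    return ids

def decode(ids: list[int]) -> str:
    return " ".join(inverse[i] for i in ids)

# Assumed field names per langchain_text_splitters.base.Tokenizer.
tokenizer = Tokenizer(
    chunk_overlap=2,      # tokens shared between consecutive chunks
    tokens_per_chunk=10,  # maximum tokens per chunk
    encode=encode,
    decode=decode,
)

text = " ".join(f"word{i}" for i in range(25))
chunks = split_text_on_tokens(text=text, tokenizer=tokenizer)
print(len(chunks))  # number of chunks produced
print(chunks[0])    # first chunk of at most 10 words
```

Each chunk holds at most tokens_per_chunk tokens, and consecutive chunks share chunk_overlap tokens, so the window advances by tokens_per_chunk - chunk_overlap tokens per step.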