split_text_on_tokens

langchain_text_splitters.base.split_text_on_tokens(*, text: str, tokenizer: Tokenizer) → List[str]

Split the incoming text using the tokenizer and return the chunks.

Parameters:
  • text (str)

  • tokenizer (Tokenizer)

Return type:

List[str]
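The behavior documented above can be illustrated with a minimal, self-contained sketch. The `Tokenizer` fields used here (`chunk_overlap`, `tokens_per_chunk`, `decode`, `encode`) follow the dataclass in `langchain_text_splitters.base`, but the function body and the toy word-level tokenizer below are an illustrative re-implementation, not the library's source:

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Tokenizer:
    """Sketch of the Tokenizer protocol expected by split_text_on_tokens."""
    chunk_overlap: int                   # tokens shared by adjacent chunks
    tokens_per_chunk: int                # maximum tokens per chunk
    decode: Callable[[List[int]], str]   # token ids -> text
    encode: Callable[[str], List[int]]   # text -> token ids


def split_text_on_tokens(*, text: str, tokenizer: Tokenizer) -> List[str]:
    """Split incoming text into overlapping token windows and decode each."""
    input_ids = tokenizer.encode(text)
    splits: List[str] = []
    start_idx = 0
    cur_idx = min(start_idx + tokenizer.tokens_per_chunk, len(input_ids))
    chunk_ids = input_ids[start_idx:cur_idx]
    while start_idx < len(input_ids):
        splits.append(tokenizer.decode(chunk_ids))
        if cur_idx == len(input_ids):
            break
        # Advance by a full window minus the overlap.
        start_idx += tokenizer.tokens_per_chunk - tokenizer.chunk_overlap
        cur_idx = min(start_idx + tokenizer.tokens_per_chunk, len(input_ids))
        chunk_ids = input_ids[start_idx:cur_idx]
    return splits


# Toy tokenizer: one "token" per whitespace-separated word (hypothetical,
# for demonstration only -- real callers pass e.g. a tiktoken-backed encoder).
vocab = "the quick brown fox jumps".split()
tok = Tokenizer(
    chunk_overlap=1,
    tokens_per_chunk=3,
    decode=lambda ids: " ".join(vocab[i] for i in ids),
    encode=lambda s: [vocab.index(w) for w in s.split()],
)

print(split_text_on_tokens(text="the quick brown fox jumps", tokenizer=tok))
# -> ['the quick brown', 'brown fox jumps']
```

With `tokens_per_chunk=3` and `chunk_overlap=1`, each new chunk repeats the last token of the previous one, which is why "brown" appears in both chunks.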