Force spacy not to parse punctuation?

Problem description

Is there a way to force spaCy not to parse punctuation as separate tokens?

 nlp = spacy.load('en')
 doc = nlp(u'the $O is in $R')

 [w for w in doc]
 : [the, $, O, is, in, $, R]

I would like:

 : [the, $O, is, in, $R]

Tags: python, tokenize, spacy, punctuation

Solution


Yes, there is. For example,

import spacy
import regex as re
from spacy.tokenizer import Tokenizer

prefix_re = re.compile(r'''^[\[\+\("']''')
suffix_re = re.compile(r'''[\]\)"']$''')
infix_re = re.compile(r'''[\(\-\)\@\.\:\$]''')  # the infix tokenization rules are what you need to change: note the escaped '$'
simple_url_re = re.compile(r'''^https?://''')

def custom_tokenizer(nlp):
    return Tokenizer(nlp.vocab, prefix_search=prefix_re.search,
                     suffix_search=suffix_re.search,
                     infix_finditer=infix_re.finditer,
                     token_match=simple_url_re.match)

nlp = spacy.load('en_core_web_sm')
nlp.tokenizer = custom_tokenizer(nlp)

doc = nlp(u'the $O is in $R')
print([w for w in doc])  # prints

[the, $O, is, in, $R]

You just need to add the '$' character to the infix regex (with the escape character '\', obviously).
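If only a handful of specific strings such as $O and $R need to stay whole, another option (a sketch of my own, not part of the answer above) is to leave the default tokenizer rules alone and register those strings as special cases via Tokenizer.add_special_case:

import spacy
from spacy.symbols import ORTH

nlp = spacy.load('en_core_web_sm')

# Sketch, not the original answer: each registered string is emitted as a
# single token; the default prefix/suffix/infix rules stay untouched.
for symbol in (u'$O', u'$R'):
    nlp.tokenizer.add_special_case(symbol, [{ORTH: symbol}])

doc = nlp(u'the $O is in $R')
print([w for w in doc])  # prints [the, $O, is, in, $R]

The trade-off is that every such symbol has to be enumerated up front, whereas the regex-based approach handles any token of that shape.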

Aside: the prefix and suffix regexes are included to show the flexibility of spaCy's tokenizer. In your case, the infix regex alone would have been enough.
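In the same spirit of changing as little as possible, you can also rebuild a single rule set from spaCy's defaults instead of writing a complete custom tokenizer. The sketch below is my own and hedged (the contents of the default rule lists differ between spaCy versions): it drops every default prefix pattern that covers the dollar sign, so whitespace-delimited chunks such as '$O' are no longer split at the leading '$'.

import spacy
from spacy.util import compile_prefix_regex

nlp = spacy.load('en_core_web_sm')

# Sketch: recompile the prefix regex from the default rules, minus any
# pattern that mentions '$'; suffixes, infixes and token_match are left as-is.
prefixes = [p for p in nlp.Defaults.prefixes if '$' not in p]
nlp.tokenizer.prefix_search = compile_prefix_regex(prefixes).search

doc = nlp(u'the $O is in $R')
print([w for w in doc])  # prints [the, $O, is, in, $R]

Note that this also stops amounts like '$100' from being split into '$' and '100', which may or may not be what you want.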

