loustique-home/venv/lib/python3.11/site-packages/jinja2/__pycache__/lexer.cpython-311.pyc

222 lines
35 KiB

2026-03-21 10:53:02 +01:00
[Binary content: compiled CPython 3.11 bytecode, not representable as text. The recoverable strings identify this as the compiled form of jinja2/lexer.py. Its module docstring reads:]

"Implements a Jinja / Python combination lexer. The ``Lexer`` class is used to do some preprocessing. It filters out invalid operators like the bitshift operators we don't allow in templates. It separates template code and python code in expressions."

[Other recoverable identifiers: the classes ``Lexer``, ``Token``, ``TokenStream``, ``TokenStreamIterator``, ``Failure``, ``OptionalLStrip`` and ``_Rule``; the helpers ``describe_token``, ``describe_token_expr``, ``count_newlines``, ``compile_rules`` and ``get_lexer``; the ``TOKEN_*`` constants (e.g. ``TOKEN_BLOCK_BEGIN``, ``TOKEN_VARIABLE_BEGIN``, ``TOKEN_COMMENT``, ``TOKEN_EOF``); and the tokenizing regexes for whitespace, newlines, strings, integers (binary/octal/hex/decimal) and floats.]