seen_tokens and get_max_length deprecated
#8
by singleffabric - opened
The `seen_tokens` attribute is deprecated and will be removed in v4.41. Use the `cache_position` model input instead.

AttributeError: 'DynamicCache' object has no attribute 'get_max_length'
past_length = past_key_values.cache_position
max_cache_length = past_key_values.get_max_cache_shape()
In my case (transformers v4.49) this one worked:
past_length = past_key_values.seen_tokens
max_cache_length = past_key_values.get_max_cache_shape()
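Since the right attribute name depends on the installed transformers version, one option is to probe for whichever API exists instead of hard-coding a single call. The sketch below is my own suggestion (not from the transformers docs); the helper names `get_past_length` and `get_max_cache_length` are made up for illustration, and the final tensor-shape fallback assumes the legacy tuple-of-tuples cache layout.

```python
# Version-tolerant cache helpers (a sketch, not an official transformers API).
# Probes for the attribute/method each transformers version actually exposes.

def get_past_length(past_key_values):
    """Number of tokens already in the KV cache, across transformers versions."""
    if past_key_values is None:
        return 0
    # Older versions: DynamicCache.seen_tokens (deprecated, removed later)
    if hasattr(past_key_values, "seen_tokens"):
        return past_key_values.seen_tokens
    # Newer versions: DynamicCache.get_seq_length()
    if hasattr(past_key_values, "get_seq_length"):
        return past_key_values.get_seq_length()
    # Legacy tuple cache: key tensor shape is (batch, heads, seq_len, head_dim)
    return past_key_values[0][0].shape[-2]

def get_max_cache_length(past_key_values):
    """Maximum cache size, or None for an unbounded DynamicCache."""
    # Newer versions replaced get_max_length() with get_max_cache_shape()
    if hasattr(past_key_values, "get_max_cache_shape"):
        return past_key_values.get_max_cache_shape()
    if hasattr(past_key_values, "get_max_length"):
        return past_key_values.get_max_length()
    return None
```

With this in the model's `prepare_inputs_for_generation`, the same checkpoint code can run on both old and new transformers without touching the deprecated names directly.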
I have the latest version of transformers (4.55.0), which doesn't support any of the above APIs. I created a conda env and installed only the required versions; it worked for me this way.
conda create -n deepseek_test python=3.10
conda activate deepseek_test
pip install transformers==4.36.2
pip install flash_attn