TODO: The current hashing function is O(L^2). We should optimize this in the future.
return hash((tuple(self.data.get_token_ids()[0:num_tokens]), self.lora_int_id))
May I ask why it is O(L^2) here?
For a string prompt, we do not hash it incrementally.
For example (for illustrative purposes), the string ABCD is hashed as hash("A"), hash("AB"), hash("ABC"), hash("ABCD"), producing 4 hash values that are used for prefix detection.
This is O(L^2) because each hash is O(L). However, the hashes can be computed incrementally so that each one is O(1), namely: hash("AB") = incremental_hash("B", hash("A")).
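To illustrate, here is a minimal sketch of the incremental idea using a hypothetical rolling polynomial hash (this is not vLLM's actual hashing function; `incremental_hash`, `BASE`, and `MOD` are illustrative names). Each prefix hash is derived from the previous one in O(1), so hashing all L prefixes costs O(L) total instead of O(L^2):

```python
BASE = 257
MOD = (1 << 61) - 1  # a large prime modulus (illustrative choice)

def incremental_hash(token: int, prev_hash: int) -> int:
    """Extend the hash of a prefix by one token in O(1)."""
    return (prev_hash * BASE + token) % MOD

def prefix_hashes(tokens):
    """Hash every prefix of `tokens` in O(L) total work."""
    hashes = []
    h = 0
    for t in tokens:
        h = incremental_hash(t, h)  # reuse the previous prefix's hash
        hashes.append(h)
    return hashes

# hashes[i] covers tokens[0:i+1], analogous to hashing
# "A", "AB", "ABC", "ABCD" without rehashing from scratch each time.
hashes = prefix_hashes([ord(c) for c in "ABCD"])
```

By contrast, the current code calls `hash(...)` on the full token prefix each time, which rehashes all preceding tokens and yields the O(L^2) total cost noted in the TODO.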