
Commit 7292545

Check if the param_encodings field is present in the encoding file passed to the set_and_freeze_param_encodings API, and use the param_encodings dict from it.

Signed-off-by: Rishabh Thakur <quic_ristha@quicinc.com>
quic-ristha authored and Prajapati, Mohit committed Mar 12, 2024
1 parent e068c40 commit 7292545
Showing 1 changed file with 4 additions and 0 deletions.
TrainingExtensions/torch/src/python/aimet_torch/quantsim.py: 4 additions & 0 deletions
@@ -1630,6 +1630,10 @@ def set_and_freeze_param_encodings(self, encoding_path: str):
         with open(encoding_path) as json_file:
             param_encodings = json.load(json_file)

+        # If a full encodings file is provided, the param encodings are nested inside the top-level encoding structure
+        if 'param_encodings' in param_encodings:
+            param_encodings = param_encodings['param_encodings']
+
         for name, quant_module in self.model.named_modules():
             if isinstance(quant_module, QcQuantizeWrapper):
                 quant_module.set_param_encoding(name, param_encodings)
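
As a usage sketch (not part of the commit), the snippet below shows the two encoding-file shapes this change lets set_and_freeze_param_encodings accept. It assumes sim is an existing aimet_torch QuantizationSimModel instance; the encoding entries are illustrative placeholders, not real calibration data.

import json

# Shape 1: a params-only encodings dict, the format the API already handled.
# Keys and values here are illustrative placeholders.
params_only = {
    "conv1.weight": [
        {"bitwidth": 8, "scale": 0.002, "offset": -128,
         "is_symmetric": "True", "dtype": "int"}
    ]
}

# Shape 2: a full encodings file, where the parameter encodings are nested
# under the top-level 'param_encodings' key; the new check unwraps this.
full_encodings = {
    "activation_encodings": {},
    "param_encodings": params_only,
}

# After this change, both file shapes load identically.
for path, contents in [("params_only.encodings", params_only),
                       ("full.encodings", full_encodings)]:
    with open(path, "w") as f:
        json.dump(contents, f)
    sim.set_and_freeze_param_encodings(encoding_path=path)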
