Describe the bug
Looking to do QAT of the TopdownPoseEstimator model, I skipped `RTMCCHead.predict` and `RTMCCHead.loss` to avoid errors during tracing (see the sketch below). The model trains fine as a fake-quantized model.
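Skipping methods like this is typically done through the quantizer's tracer config. A minimal sketch, assuming mmrazor's `CustomTracer` with `skipped_methods`; the `RTMCCHead` import path and the quantizer type are placeholders, adapt them to your setup:

```python
# Sketch of a QAT config; the RTMCCHead module path and the quantizer
# type are assumptions, not taken from my actual config.
model = dict(
    _scope_='mmrazor',
    type='MMArchitectureQuant',
    architecture=_base_.model,  # the TopdownPoseEstimator config
    quantizer=dict(
        type='mmrazor.OpenVINOQuantizer',  # placeholder backend quantizer
        global_qconfig=global_qconfig,     # qconfig dict elided here
        tracer=dict(
            type='mmrazor.CustomTracer',
            skipped_methods=[
                'mmpose.models.heads.coord_cls_heads.rtmcc_head.RTMCCHead.predict',
                'mmpose.models.heads.coord_cls_heads.rtmcc_head.RTMCCHead.loss',
            ])))
```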
When running deployment with a local branch that merges the mmdeploy `for_mmrazor` branch into v1.3.1, with the patches described in #632, #633, #634 and #637 applied, I run into an error:
```
  File .../mmrazor/models/algorithms/quantization/mm_architecture.py, line 362, in get_deploy_model
    observed_model.load_state_dict(quantized_state_dict)
  File .../torch/nn/modules/module.py, line 2041, in load_state_dict
    raise RuntimeError("Error(s) in loading state_dict for {}:\n\t{}".format(
RuntimeError: Error(s) in loading state_dict for GraphModule
```
Presumably the 'predict' graph, traced with the head methods skipped, produces state dict keys that don't match the graph prepared at deploy time. I suggest using the 'tensor' mode state dict instead, though I've yet to test it:
```diff
 def get_deploy_model(self):
     """Prepare for deploy to the backend with mmdeploy, which will be used
     in mmdeploy, and usually includes as follows:

     1. prepare for the float model rewritten by mmdeploy.
     2. load checkpoint consists of float weight and quantized params in
        mmrazor.
     3. post process weight fakequant for exporting .onnx that meet
        the backend's requirement.
     """
     device = next(self.parameters()).device
-    quantized_state_dict = self.qmodels['predict'].state_dict()
+    quantized_state_dict = self.qmodels['tensor'].state_dict()
     fp32_model = self.architecture
     self.quantizer.convert_batchnorm2d(fp32_model)
     observed_model = self.quantizer.prepare(fp32_model)
     observed_model.load_state_dict(quantized_state_dict)
     self.quantizer.post_process_for_deploy(
         observed_model,
```
EDIT: after testing this, I got the following error:
```
  File .../mmrazor/models/algorithms/quantization/mm_architecture.py, line 376, in get_deploy_model
    fakequant_new = QConfigHandler.replace_fakequant(
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File .../torch/nn/modules/module.py, line 1614, in __getattr__
    raise AttributeError(...
AttributeError: 'FixedQParamsObserver' object has no attribute 'min_val'
```
I gather this means that `FixedQParamsObserver` keeps fixed quantization parameters and never records a `min_val`, so `replace_fakequant` cannot rebuild the fake-quant modules from the 'tensor' graph. It seems the fix requires using the 'predict' mode state dict after all, while somehow tolerating the missing keys.
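An untested idea for that, as a minimal sketch against the same `get_deploy_model` body (the `strict=False` argument and its return value are standard `torch.nn.Module.load_state_dict` behaviour; everything else mirrors the snippet above):

```python
device = next(self.parameters()).device
# Keep the 'predict' state dict, which replace_fakequant can consume.
quantized_state_dict = self.qmodels['predict'].state_dict()
fp32_model = self.architecture
self.quantizer.convert_batchnorm2d(fp32_model)
observed_model = self.quantizer.prepare(fp32_model)
# strict=False tolerates the keys lost by skipping RTMCCHead.predict/.loss;
# load_state_dict returns the missing/unexpected keys for inspection.
missing, unexpected = observed_model.load_state_dict(
    quantized_state_dict, strict=False)
if missing or unexpected:
    print(f'missing keys: {missing}\nunexpected keys: {unexpected}')
```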