Can we use knowledge distillation rather than model quantization?

Can we use knowledge distillation in place of model quantization?

Please explain why you want to use knowledge distillation in place of model quantization.
Have you tried quantizing the student after distillation?
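To make the "quantize the student after distillation" option concrete, here is a minimal NumPy sketch of post-training symmetric per-tensor int8 weight quantization. The function names (`quantize_int8`, `dequantize`) and the per-tensor scaling scheme are illustrative assumptions, not something from this thread; real toolchains (e.g. PyTorch or TensorFlow quantization) handle this with calibration and per-channel scales.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor int8 quantization of a float weight array."""
    max_abs = np.abs(w).max()
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 values and the scale."""
    return q.astype(np.float32) * scale
```

The student's distilled weights would be quantized layer by layer this way, trading a small accuracy loss for a 4x reduction in weight storage versus float32.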

No, I just wanted to ask whether that is possible.

Yes, you can. Keep in mind they compress differently: distillation trains a smaller student model to mimic a larger teacher, while quantization lowers the numeric precision of an existing model's weights, so the two can also be combined.
Do check that the student model meets the acceptance criteria for your project.
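As an illustration of what "distillation" means in training terms, here is a minimal NumPy sketch of the distillation loss from Hinton et al. (2015): a soft-target term at temperature T plus a hard-label cross-entropy term. The names and hyperparameter values (T, alpha) are illustrative assumptions, not anything specified in this thread.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend of soft-target and hard-label losses for the student."""
    # Soft-target cross-entropy against the teacher's tempered probabilities
    # (equivalent to the KL term up to a constant); scaled by T^2 so its
    # gradient magnitude is comparable to the hard-label term.
    p_teacher = softmax(teacher_logits, T)
    log_p_soft = np.log(softmax(student_logits, T) + 1e-12)
    kd = -(p_teacher * log_p_soft).sum(axis=-1).mean() * (T ** 2)
    # Standard cross-entropy against the ground-truth labels at T = 1
    log_p = np.log(softmax(student_logits) + 1e-12)
    ce = -log_p[np.arange(len(labels)), labels].mean()
    return alpha * kd + (1 - alpha) * ce
```

In practice this loss would drive gradient updates of the student in a framework such as PyTorch; the sketch just shows how the two terms combine.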