This paper introduces a novel approach to prosthetic wrist control that addresses the limitations of traditional electromyography-based methods. While previous research has focused primarily on hand and gripper development, our study emphasizes the importance of wrist mobility for enhancing dexterity and manipulation skill. Leveraging a combination of visual data and inertial sensors, we propose a system that estimates object orientation in real time, enabling automatic and natural control of a prosthetic wrist. Our deep learning-based model accurately interprets object posture from the user's perspective, allowing seamless wrist movement that follows the object's inclination. In addition, Gaussian filtering is employed to mitigate noise in the image-based posture estimates while preserving the underlying trend. With this approach, users achieve natural wrist positioning without additional muscle activity, significantly improving prosthetic usability and the overall user experience.
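To illustrate the noise-suppression step, the following is a minimal sketch of 1-D Gaussian filtering applied to a per-frame orientation-angle series. All names, the synthetic signal, and the kernel radius are illustrative assumptions, not the paper's actual pipeline:

```python
import numpy as np

def gaussian_smooth(x, sigma):
    """Smooth a 1-D signal with a truncated Gaussian kernel (reflect padding)."""
    radius = int(3 * sigma)  # common truncation at 3 standard deviations
    k = np.exp(-0.5 * (np.arange(-radius, radius + 1) / sigma) ** 2)
    k /= k.sum()  # normalize so the filter preserves the signal's mean level
    padded = np.pad(x, radius, mode="reflect")
    return np.convolve(padded, k, mode="valid")

# Hypothetical example: a slowly varying object inclination (degrees)
# corrupted by per-frame pose-estimation jitter.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200)
true_angle = 30 * np.sin(t)
noisy = true_angle + rng.normal(0.0, 5.0, t.size)

smoothed = gaussian_smooth(noisy, sigma=4)
# The smoothed series suppresses frame-to-frame jitter while tracking the
# underlying inclination trend that drives the wrist command.
```

Because the kernel is low-pass, high-frequency estimation jitter is attenuated while the slowly varying inclination trend, the quantity relevant to wrist actuation, passes through largely intact.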