Drones are increasingly used in fields such as agriculture, logistics, and disaster response due to their agility and versatility. In indoor plant factories, small drones are used to monitor crop conditions and collect environmental data. However, small drones require frequent recharging because of their limited battery capacity, making autonomous charging systems essential for uninterrupted operation. This study proposes an autonomous charging station designed for small drones in indoor plant factories. The system employs a wired charging mechanism to enhance charging efficiency and a 3-degree-of-freedom (DOF) pose alignment system, built from an XY plotter and a turntable, to correct drone landing errors. The alignment system ensures that drones landing at arbitrary positions and orientations are automatically adjusted to the correct position for charging. Experiments demonstrated that the charging station successfully aligned and charged drones with a 93% success rate on the first attempt. Even in cases of failure, the system automatically retried until a 100% success rate was achieved. This autonomous drone charging system has the potential to significantly enhance operational efficiency in indoor plant factories and can be adapted for various drone models in future applications.
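As an illustration of the 3-DOF alignment step, the sketch below converts a measured landing pose error into XY-plotter and turntable commands. It is a minimal Python sketch under assumed conventions: the `Pose2D` fields, the step resolutions, and the sign conventions are hypothetical and not taken from the paper.

```python
from dataclasses import dataclass
import math

@dataclass
class Pose2D:
    x: float      # metres, measured landing offset along the plotter X axis
    y: float      # metres, measured landing offset along the plotter Y axis
    theta: float  # radians, measured heading error of the drone

def alignment_commands(measured: Pose2D, xy_step_m: float = 0.0005,
                       turn_step_rad: float = math.radians(0.9)):
    """Convert a measured landing pose error into XY-plotter and turntable steps.

    Hypothetical sketch: assumes the plotter carries the charging contacts to the
    drone's position and the turntable removes the heading error; the step sizes
    are illustrative, not the station's actual resolution.
    """
    steps_x = round(-measured.x / xy_step_m)   # cancel the X offset
    steps_y = round(-measured.y / xy_step_m)   # cancel the Y offset
    # wrap the heading error to (-pi, pi] before converting to turntable steps
    wrapped = math.atan2(math.sin(measured.theta), math.cos(measured.theta))
    steps_turn = round(-wrapped / turn_step_rad)
    return steps_x, steps_y, steps_turn

if __name__ == "__main__":
    print(alignment_commands(Pose2D(x=0.012, y=-0.008, theta=math.radians(35))))
```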
It is difficult for a human operator to find the roll, pitch, and yaw (RPY) angles that indicate the desired direction of an unmanned aerial vehicle (UAV) in three-dimensional space. Herein, a controller was developed that allows the human operator to control the direction of a UAV without specifying RPY information. The algorithm implemented in the controller automatically calculated the RPY of the UAV from the normal vector of the end effector. The developed controller was designed using a parallel mechanism, and its joint angles were measured with potentiometers to estimate the normal vector of the end effector. Five subjects participated in an experiment in which a vector in three-dimensional space was controlled to follow a randomly generated target vector using the developed controller and conventional thumb sticks. The performance of the two controllers was evaluated by two methods: measuring the time required to reduce the error between the controlled vector and the target vector to less than 0.1 cm, and calculating a normalized error between the controlled vector and the target vector after manipulating the controlled vector for 10 seconds. When using the developed controller, the differences in control ability between subjects were reduced, and both the required time and the normalized error were generally lower.
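The mapping from an end-effector normal vector to RPY can be illustrated with a short sketch. The version below assumes a ZYX (yaw-pitch-roll) Euler convention with yaw fixed at zero and aligns the UAV body z-axis with the commanded normal vector; the actual convention and the parallel-mechanism kinematics used in the paper may differ.

```python
import numpy as np

def normal_to_rpy(normal):
    """Map a desired end-effector normal vector to UAV roll/pitch/yaw.

    Assumes ZYX (yaw-pitch-roll) Euler angles with yaw fixed at zero, and that
    the UAV body z-axis should align with the commanded unit normal. This is an
    illustrative convention, not necessarily the one used in the paper.
    """
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)           # unit normal of the end effector
    roll = -np.arcsin(n[1])             # ny = -sin(roll)
    pitch = np.arctan2(n[0], n[2])      # nx = sin(pitch)cos(roll), nz = cos(pitch)cos(roll)
    yaw = 0.0                           # yaw is left as a free parameter
    return roll, pitch, yaw

def rpy_to_z_axis(roll, pitch, yaw):
    """Body z-axis in the world frame for the same ZYX convention (for checking)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx @ np.array([0.0, 0.0, 1.0])

if __name__ == "__main__":
    t = np.array([0.2, -0.3, 0.9])            # example normal vector from the controller
    r, p, y = normal_to_rpy(t)
    print(np.round(rpy_to_z_axis(r, p, y), 6))  # should match the unit normal below
    print(np.round(t / np.linalg.norm(t), 6))
```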
Knee contact forces and knee stiffness are biomechanical factors worth considering for walking in knee osteoarthritis patients. However, these factors are challenging to acquire in real time, which makes it difficult to use them in robotic rehabilitation and assistive systems. This study investigated whether trained deep neural networks (DNNs) can capture these biomechanical factors using only gait kinematics, which can be measured by sensors in real time. A public dataset of overground walking was processed through biomechanical analysis to train and test the DNNs. Using the training dataset, several DNN topologies were explored via Bayesian optimization to tune the hyperparameters. After optimization, the DNNs were trained in a supervised manner to estimate the biomechanical factors. The trained DNNs were then evaluated using two new datasets that were not used in the training process. The trained DNNs estimated the biomechanical factors with a high level of accuracy on both test datasets. The results confirm that DNNs can estimate these biomechanical factors from gait kinematics alone.
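A minimal sketch of the supervised regression setup is shown below, assuming the general approach described in the abstract (a fully connected DNN mapping gait kinematics to the biomechanical targets). The input and output dimensions, layer widths, and training hyperparameters are illustrative placeholders; the actual topology was selected via Bayesian optimization.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical dimensions: e.g. joint angles/velocities as inputs, and
# knee contact force plus knee stiffness as the regression targets.
N_KINEMATIC_FEATURES = 18
N_BIOMECH_TARGETS = 2

class BiomechanicsNet(nn.Module):
    """Fully connected regressor from gait kinematics to biomechanical factors."""
    def __init__(self, hidden=(128, 64), dropout=0.1):
        super().__init__()
        layers, in_dim = [], N_KINEMATIC_FEATURES
        for width in hidden:  # topology/hyperparameters would be tuned, e.g. by Bayesian optimization
            layers += [nn.Linear(in_dim, width), nn.ReLU(), nn.Dropout(dropout)]
            in_dim = width
        layers.append(nn.Linear(in_dim, N_BIOMECH_TARGETS))
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

def train(model, loader, epochs=50, lr=1e-3):
    """Standard supervised regression loop with mean-squared-error loss."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for kinematics, targets in loader:
            opt.zero_grad()
            loss = loss_fn(model(kinematics), targets)
            loss.backward()
            opt.step()
    return model

if __name__ == "__main__":
    x = torch.randn(256, N_KINEMATIC_FEATURES)   # placeholder kinematic features
    y = torch.randn(256, N_BIOMECH_TARGETS)      # placeholder biomechanical targets
    loader = DataLoader(TensorDataset(x, y), batch_size=32, shuffle=True)
    train(BiomechanicsNet(), loader, epochs=2)
```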
Lower limb deformity can be congenital or can arise from acquired factors. This paper proposes a new technique for surgical planning of Corrective Osteotomy for Lower Limbs (COLL) using 2D-3D medical image registration. A 3D model of the lower limb is reconstructed from computed tomography (CT) data and segmented into the femur, tibia, and fibula that compose the lower limb. By rearranging the model based on biplane 2D X-ray images, a 3D upright bone structure was acquired. There are two ways to register the 3D data to the 2D images: intensity-based registration and feature-based registration. Although the intensity-based method takes more time, it provides more precise results and improves the accuracy of surgical planning.
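To illustrate intensity-based 2D-3D registration, the toy sketch below optimizes pose parameters so that a parallel-projection digitally reconstructed radiograph (DRR) of a synthetic volume matches a target image under normalized cross-correlation. The projection model, similarity metric, and optimizer are illustrative choices and are simpler than what a clinical biplane X-ray setup would require.

```python
import numpy as np
from scipy import ndimage, optimize

def drr_parallel(volume, rx_deg, rz_deg, tx, ty):
    """Toy DRR: rotate the CT volume, sum along one axis (parallel projection),
    then shift in-plane. A real system would model the X-ray perspective geometry."""
    vol = ndimage.rotate(volume, rx_deg, axes=(1, 2), reshape=False, order=1)
    vol = ndimage.rotate(vol, rz_deg, axes=(0, 1), reshape=False, order=1)
    drr = vol.sum(axis=0)
    return ndimage.shift(drr, (ty, tx), order=1)

def ncc(a, b):
    """Normalized cross-correlation; higher means a better intensity match."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def register(volume, xray):
    """Search pose parameters that maximize NCC between the DRR and the X-ray."""
    cost = lambda p: -ncc(drr_parallel(volume, *p), xray)
    res = optimize.minimize(cost, x0=np.zeros(4), method="Powell")
    return res.x, -res.fun

if __name__ == "__main__":
    ct = np.zeros((32, 32, 32)); ct[10:22, 12:20, 14:18] = 1.0   # synthetic "bone"
    target = drr_parallel(ct, rx_deg=5.0, rz_deg=-3.0, tx=1.5, ty=-1.0)
    pose, score = register(ct, target)
    print(np.round(pose, 2), round(score, 3))
```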