When selecting a motor for a machine design, even one that demands precision control, the initial considerations are speed and torque characteristics. For applications such as robotic joint control, kinematic accuracy relating to position and velocity control is also fundamental. Provided that the speed and torque requirements have been calculated, a motor can be selected against these criteria, along with inertial acceleration, using motor manufacturers’ specifications.
However, in many cases, precise system power and accuracy requirements cannot be calculated until a prototype mechanical assembly has been tested. In the meantime, initial motor selection can rely on tribal knowledge of motors used in legacy machines with equivalent functionality, or motors can be oversized during prototyping and then downsized once precise requirements are known.
SPEED CONTROL
The step motor is often the first consideration when specifying a motor for precision control because of its low cost. However, its suitability depends on speed requirements, as a step motor’s maximum speed is limited by its higher pole count. That same pole count can be an advantage over servo motors, though, where high torque density is required. While a stepper can provide positioning sufficient for many applications, accuracy depends on system loading as a proportion of the step motor’s torque rating. At 10% loading, positional error is approximately one quarter of a whole step, or 0.5°.
Alternatively, a servo motor offers a much higher maximum speed. High-speed applications, including those above 5,000 RPM, typically rotate a balanced inertia with no external loading, as in a centrifuge. As the system accelerates, radial forces become the dominant bearing load, and their impact is proportional to the system’s eccentricity. Modelling these radial bearing forces to determine the scope of the torque requirements is typically part of prototype testing.
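As a rough illustration of why eccentricity matters at high speed, the sketch below estimates the rotating radial force produced by a residual imbalance using F = m·e·ω². The rotor mass, eccentricity and speed are assumed example values, not figures from any particular application.

```python
import math

def imbalance_force(mass_kg: float, eccentricity_m: float, speed_rpm: float) -> float:
    """Rotating radial force from a residual imbalance: F = m * e * omega^2."""
    omega = speed_rpm * 2 * math.pi / 60.0  # convert RPM to rad/s
    return mass_kg * eccentricity_m * omega ** 2

# Assumed example: a 2 kg rotor with 10 um residual eccentricity at 5,000 RPM
force_n = imbalance_force(2.0, 10e-6, 5000.0)
print(f"Radial bearing load from imbalance: {force_n:.1f} N")  # roughly 5.5 N
```

The quadratic dependence on speed is the point of the exercise: doubling the speed quadruples the radial bearing load for the same eccentricity.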
On the other hand, if a servo motor is accelerating and decelerating with an unbalanced inertia, for example when controlling a joint in an articulating robot arm, inertial properties dominate the motor torque demand.
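A minimal sketch of how inertial properties set the torque demand for such a move, assuming a simple trapezoidal velocity profile and illustrative inertia and timing values:

```python
def peak_inertial_torque(joint_inertia_kgm2: float, peak_speed_rad_s: float, accel_time_s: float) -> float:
    """Peak torque needed to accelerate a joint inertia to speed in a given time (T = J * alpha)."""
    alpha = peak_speed_rad_s / accel_time_s  # angular acceleration, rad/s^2
    return joint_inertia_kgm2 * alpha

# Assumed example: 0.05 kg*m^2 of joint inertia accelerated to 3 rad/s in 0.1 s
print(f"Peak inertial torque: {peak_inertial_torque(0.05, 3.0, 0.1):.2f} N*m")  # 1.50 N*m
```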
In terms of control accuracy, a servo motor with position feedback is the optimum choice. In most cases, a servo can settle within +/- 10 encoder counts, but this also requires an encoder with sufficient positional resolution. The servo motor’s response is also critical: in theory the motor’s kinematic response should be linear with torque, but static friction makes a truly linear response impossible when starting and stopping a move. High-accuracy systems therefore need additional specialised mechanics to limit this effect.
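To show how encoder resolution relates to that +/- 10 count settling window, the short sketch below converts a count window into an angular window; the encoder resolution used is an assumed example value.

```python
def settle_window_deg(encoder_counts_per_rev: int, window_counts: int = 10) -> float:
    """Total angular width of a +/- window_counts settling band for a given encoder resolution."""
    return 2 * window_counts * 360.0 / encoder_counts_per_rev

# Assumed example: a 2,500-line encoder read in quadrature gives 10,000 counts/rev
print(f"+/-10 counts = {settle_window_deg(10_000):.3f} deg total window")  # 0.720 deg
```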
Brushless DC motors (BLDC) can also be used for positional control with a feedback device. An additional encoder adds its own footprint and cost, but a BLDC motor is more efficient than a servo and offers higher torque density. It can also enable a simpler, more flexible integration approach that can aid machine design. Frameless BLDC motors can have a hollow shaft, allowing components to be placed through their centre, and their design also saves footprint and weight. These motors are often direct drive, connecting to the load without the need for a transmission, which makes them highly dynamic and capable of high speeds.
INERTIA MATCHING
Whatever motor technology is selected, matching the inertia of the motor and its load is crucial to optimise response time and prevent operational problems such as vibration. It is possible to run at a large inertia ratio, gaining the advantages of a smaller motor while still meeting torque/speed requirements. However, this increases the demand on input power, and careful selection is essential to avoid the mechanical instability that can cause motor oscillation at higher frequencies.
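As a sketch of the inertia-matching check (the gearbox ratio and inertia values below are assumptions for illustration), load inertia is reflected through the gearbox by dividing it by the square of the ratio before comparing it with the motor’s rotor inertia:

```python
def inertia_ratio(load_inertia_kgm2: float, motor_inertia_kgm2: float, gear_ratio: float = 1.0) -> float:
    """Load-to-motor inertia ratio, with the load reflected through the gearbox (J_load / N^2)."""
    reflected = load_inertia_kgm2 / gear_ratio ** 2
    return reflected / motor_inertia_kgm2

# Assumed example: 4e-4 kg*m^2 load, 2e-5 kg*m^2 rotor inertia, 3:1 gearbox
ratio = inertia_ratio(4e-4, 2e-5, gear_ratio=3.0)
print(f"Inertia ratio: {ratio:.1f}:1")  # about 2.2:1
```

A higher gear ratio brings the reflected load inertia down quickly, which is why a small motor behind a gearbox can still achieve an acceptable ratio.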
Ultimately, motor specification optimisation requires a thorough understanding of the application. INMOCO’s engineers work alongside engineers from Performance Motion Devices to help achieve the most appropriate motion solution.
BOX: HOW LOAD AFFECTS MOTOR SELECTION
When selecting a miniature motor, understanding the different loads acting on the motor is key to achieving the best design. The torque load defines the necessary motor power, while radial and axial loads determine performance and longevity, according to Valentin Raschke, application engineer at miniature and specialty motor supplier Portescap.
In certain applications, the motor or gearbox must not only provide a certain torque to drive the load, but must also support a radial load. Examples of a radial load – that is, a force acting perpendicular to the motor shaft’s axis of rotation – include a belt drive driving an axis parallel to the motor, or a diaphragm pump. In the latter case, a piston mounted on the motor shaft creates the up-and-down movement that drives liquid flow, applying a radial load to the motor in the process.
BEARING SELECTION
Radial load is relevant due to its impact on bearing choice. For a brush DC or stepper motor, there are two standard bearing options: sleeve bearings or ball bearings. Sleeve bearings typically support a lesser radial load than ball bearings and offer a shorter lifetime, but this is offset by their reduced cost. Using sleeve bearings is sufficient for most motor applications where cost is important and low or no radial load is present. However, for applications like belt drives and diaphragm pumps, which have higher radial load demands, utilising at least one ball bearing, positioned closest to the point of load, can significantly extend lifetime.
In contrast, brushless DC motors (BLDC) typically use two ball bearings. As BLDC motors are electronically commutated (instead of the mechanical commutation of brush DC motors), their lifetime depends mainly on the reliability of the bearing system. Using ball bearings in BLDC motors therefore enables a long lifetime even at high speeds.
As part of the specification, a motor manufacturer will typically define a maximum radial dynamic force for a minimum motor lifetime at a specific speed. However, this depends on the size of the bearings used, the distance between the two ball bearings in the motor, and the point where the radial load is applied. Typically, a long motor with oversized ball bearings supports a larger radial load than a shorter motor.
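The dependence on bearing spacing and load point can be illustrated with simple beam statics. The sketch below treats the shaft as rigid and supported at two bearings; the load and dimensions are assumed example values, not manufacturer data.

```python
def bearing_reactions(radial_load_n: float, bearing_span_m: float, load_overhang_m: float):
    """Reaction forces at the front and rear bearings for a radial load applied
    at load_overhang_m beyond the front bearing (rigid shaft, two supports)."""
    # Moment balance about the rear bearing gives the front bearing reaction
    front = radial_load_n * (bearing_span_m + load_overhang_m) / bearing_span_m
    # Force balance: the rear bearing reacts with the remainder, opposite in direction
    rear = front - radial_load_n
    return front, rear

# Assumed example: 20 N belt load applied 10 mm beyond the front bearing, 30 mm bearing span
front_n, rear_n = bearing_reactions(20.0, 0.030, 0.010)
print(f"Front bearing: {front_n:.1f} N, rear bearing: {rear_n:.1f} N")  # ~26.7 N and ~6.7 N
```

The front bearing sees more than the applied load because of the overhang, which is why a longer motor with a wider bearing span, or oversized bearings at the front, supports a larger radial load.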
DYNAMIC AND STATIC LOAD CHALLENGES
Axial load is a load acting in the direction of the motor axis; it appears alongside a radial load when using a worm gear, for example. Significant axial loads usually have to be supported by ball bearings rather than sleeve bearings because of the higher demands involved. A motor’s maximum recommended dynamic axial load depends on the ball bearings used, their arrangement, and their preload.
For a BLDC motor, the dynamic axial load is typically supported by the front ball bearing, which is preloaded from inside the motor. If an axial push load acts on the motor, the preload on the front ball bearing is reduced. This can lead to additional radial play, negatively impacting lifetime and causing vibration and noise. In the opposite case, an axial pull load acts in the same direction as the internal preload, adding to it and increasing the stress on the bearing.
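A minimal sketch of that preload effect, with assumed example values: the net axial force on the front bearing is the internal preload minus any external push load (plus any pull load), and a negative result indicates the bearing has been unloaded.

```python
def net_front_preload(internal_preload_n: float, axial_push_n: float = 0.0, axial_pull_n: float = 0.0) -> float:
    """Net axial force on the front bearing: a push load subtracts from the internal
    preload (risking radial play), while a pull load adds to it (raising bearing stress)."""
    return internal_preload_n - axial_push_n + axial_pull_n

# Assumed example: 5 N internal preload with an 8 N axial push from the application
net = net_front_preload(5.0, axial_push_n=8.0)
if net < 0:
    print(f"Front bearing unloaded by {-net:.1f} N: expect radial play, vibration and noise")
else:
    print(f"Front bearing net preload: {net:.1f} N")
```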
When specifying a motion solution, it’s vital to discuss how it will integrate with the application as a whole, as different loads will have a significant impact on a motor’s performance, reliability and lifetime.