The usual solution is a proportional-integral-differential (PDI) controller driven by position measurements.
The system has a resonant frequency that depends on the damping, mass of the motor, and brush friction. Those set a hard limit on the maximum possible speed.
PDI will get you close to that speed, but there's always going to be a trade-off between speed and accuracy.
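For reference, the core of such a controller is only a few lines; here's a minimal sketch of the position loop, with gains that are placeholders you'd have to tune against the real axis:

    // Minimal position-loop sketch. Gains and dt are illustrative, not tuned values.
    struct PositionLoop {
        float kp, ki, kd;       // proportional / integral / derivative gains
        float integral = 0.0f;  // accumulated error
        float prevError = 0.0f; // error from the previous update

        // target and measured are axis positions; dt is the loop period in seconds.
        float update(float target, float measured, float dt) {
            float error = target - measured;
            integral += error * dt;
            float derivative = (error - prevError) / dt;
            prevError = error;
            return kp * error + ki * integral + kd * derivative; // motor command
        }
    };

The whole speed/accuracy trade-off lives in those three gains: push the proportional gain up for speed and you overshoot; add derivative action to damp that and you start amplifying sensor noise.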
From the test, you're still quite a way out from perfect repeatability. (I prefer the automated version, because the smoother strokes look slightly better.)
I suspect you'll find that PDI with a thicker medium like oils/acrylics will be harder because the brush friction will be less consistent. And in fact real paintings are often created with a range of brush sizes and perhaps a range of palette knives - so building a robot to handle all of that isn't going to be trivial.
The first time you wrote "PDI", I assumed it was a typo for "PID", but you kept writing it. I'm curious, what's your background that you know it as "PDI"?
You can get good 1/32 drivers like the DRV8825 (https://www.pololu.com/product/2133) for not too much - I'm pretty confident they'll be able to do what you need (you can buy knockoffs for even cheaper). Even 1/16 is pretty dang smooth.
It also might be that you're supplying too much or too little current to the motors - both will result in jerkiness and noise. If your drivers support current limiting, I'd look into tuning it. If they don't, the ones I linked do.
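On those Pololu carriers the limit is set with the little onboard pot; per Pololu's docs for the DRV8825, current limit = VREF * 2, so picking a setting is simple (values here are illustrative):

    // Pololu DRV8825 carrier: current limit = VREF * 2 (per Pololu's docs).
    // Values are illustrative; check your motor's datasheet for the rated current.
    #include <cstdio>

    int main() {
        float coilCurrentA = 1.0f;         // rated current per coil
        float vref = coilCurrentA / 2.0f;  // voltage to dial in at the driver's pot
        std::printf("Set VREF to %.2f V for %.1f A per coil\n", vref, coilCurrentA);
    }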
They are current limiting and can go down to 1/16 step I believe. However, even at the current settings the motors can't keep up if the artist whips their arm from one side to the other.
But why can't the artist do their work in real time, getting feedback digitally from the monitor, and then you play back their strokes later to do all the rendering/plotting? I can see how they might like using the plotter medium for feedback, but I'd think all the noise would be distracting.
That's definitely something we're working on. To be honest it's not very hard: I'd just modify the capture script to record pen and pressure input and turn it into vectors, instead of capturing movement from the robot as it does now.
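Roughly like this, where capturing() and readPenSample() are hypothetical stand-ins for whatever the tablet API actually exposes:

    // Hypothetical capture loop: capturing() and readPenSample() are stand-ins
    // for the real tablet API. Each stroke becomes a vector of (x, y, pressure).
    #include <vector>

    struct PenSample { float x, y, pressure; bool down; };

    // Stubs so the sketch compiles; the real tablet API calls go here.
    bool capturing() { return false; }
    PenSample readPenSample() { return {0.0f, 0.0f, 0.0f, false}; }

    std::vector<std::vector<PenSample>> captureStrokes() {
        std::vector<std::vector<PenSample>> strokes;
        std::vector<PenSample> current;
        while (capturing()) {
            PenSample s = readPenSample();
            if (s.down) {
                current.push_back(s);        // pen on the tablet: extend the stroke
            } else if (!current.empty()) {
                strokes.push_back(current);  // pen lifted: close out the stroke
                current.clear();
            }
        }
        return strokes;
    }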
The problem though is that the actual brush strokes are different from a brush stroke generated in Photoshop. Controlling the machine lets them see the final output in real time, so they can paint it more like a real physical painting, and then the robot can reproduce the physical painting.
I think the next demo will make the difference between lines on the computer and physically rendered brush strokes more evident.
Aha, yes: in the prototyping stage that sort of immediate feedback would prove useful for converging on the modeling needed to properly translate brush-stroke pressure from the tablet into accurate saturation on the canvas. My comment assumed that translation was already in place, but if you're still discovering it, then the device would have to be trained. Which is an interesting problem in itself: maybe apply machine learning to train the device, much like Andrew Ng and team trained that helicopter to fly upside down.
Also, I tried to design this around real-life workflows. Digital artwork is rarely designed to be produced physically, and physical artwork is made physically, with a real-time feedback loop from observing the brushwork and colors.
That being said, turning digital artwork into something physical is still an interesting problem domain, since artists can't all be expected to come wield my machine. I'd like an artist to be able to work the machine remotely in some manner.
I've had luck using gears to introduce a reduction in movement by increasing the final drive ratio. The motors will move at faster speeds but the gears slow down the process and allow the movement to be more precise. Dunno how it would affect your robot since you have a direct 1:1 connection with the tablet.
Yea, that's the main problem. Ramping the motor is a solved problem, but because I'm getting real-time data and latency is an issue, the microcontroller doesn't know whether the next step is going to stop, change direction, or speed up.
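One way to cope (just a sketch of the idea, not what the current firmware does) is to skip lookahead entirely and rate-limit the commanded velocity each control tick toward whatever the newest input implies:

    // Per-tick velocity limiter: instead of planning a full ramp, each control
    // tick moves the commanded velocity toward whatever the newest input implies,
    // capped by the axis's real acceleration limit. Names/values are illustrative.
    #include <algorithm>

    float stepTowardTarget(float currentVel,  // steps/s we're stepping at now
                           float targetVel,   // steps/s implied by the newest input
                           float maxAccel,    // steps/s^2 the motor can handle
                           float dt) {        // control-tick period in seconds
        float maxDelta = maxAccel * dt;       // largest allowed change this tick
        float delta = std::clamp(targetVel - currentVel, -maxDelta, maxDelta);
        return currentVel + delta;            // never exceeds the accel limit
    }

The cost is that every direction reversal becomes a short controlled decel/accel instead of an instant flip, which adds a little lag but keeps the motors from stalling.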
I've never had good luck with the cheap ones. I use NEMA steppers with a built-in transmission. Here's a link as an example; I've never bought from this place: http://omc-stepperonline.kancart.com/categories/4#!1
The description says it's good for "low rotational speeds." It seems to have a very small step angle. Would it be fast enough to drive smooth 1:1 motion? For example, the current motor is really smooth if the movement area is small enough.
How are you creating your ramping? Most low-end CNC controllers do constant-acceleration ramping: if you plot the velocity over time, it looks like a trapezoid. This is obviously easier computation for a microcontroller, but at the points where you change from constant acceleration to constant velocity, you get a spike in jerk, which is the derivative of acceleration. It is these points which are creating the jolts in the system.
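Concretely (a sketch, assuming the move is long enough to reach vmax), the trapezoid is just:

    // Trapezoidal (constant-acceleration) velocity profile. The jerk spikes are
    // at t1 and t2, where acceleration changes instantaneously. Assumes the move
    // is long enough to actually reach vmax.
    float trapezoidVelocity(float t, float vmax, float amax, float totalTime) {
        float t1 = vmax / amax;     // end of the acceleration phase
        float t2 = totalTime - t1;  // start of the deceleration phase
        if (t < t1) return amax * t;        // ramp up
        if (t < t2) return vmax;            // cruise
        return vmax - amax * (t - t2);      // ramp down
    }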
A better ramping scheme is to use constant-jerk acceleration. If you plot velocity over time, the acceleration and deceleration segments look like the letter 'S', which is why this is also referred to as S curve ramping.
The gist of it is that an axis has some maximum acceleration it can achieve, Amax. Accelerating faster than Amax means a stepper motor can't execute all of the steps you send it. To achieve constant-jerk acceleration, you increase the acceleration at a constant rate until you've reached Amax. Then you accelerate at constant acceleration for some time. Finally, you decrease the acceleration at a constant rate until it reaches 0. Obviously, you want this tapering-off of acceleration to coincide with reaching the desired velocity. You first find the times of the two constant-jerk sections, and then deduce the constant-acceleration section's time. The amount of jerk used is determined by how much extra time you're willing to spend compared to constant-acceleration ramping.
The downside is increased computation. Constant-acceleration ramping only needs to solve for time in a 2nd-order equation, which is just the quadratic formula. Constant-jerk ramping must solve for time in a 3rd-order equation, a much more computationally intensive task. This explains why it's normally a feature found only in high-end CNC controllers.
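Here's the velocity side of that recipe as a sketch; note it only plans the ramp to a target velocity - the cubic shows up when you need time as a function of distance along the ramp:

    // S-curve (constant-jerk) ramp, following the recipe above: solve the two
    // constant-jerk segment times first, then deduce the constant-acceleration
    // segment. Illustrative sketch, not planner-grade code.
    #include <cmath>

    struct SCurve {
        float j;     // jerk (steps/s^3)
        float tj;    // duration of each constant-jerk segment
        float ta;    // duration of the constant-acceleration segment
        float apeak; // peak acceleration actually reached
    };

    SCurve planAccel(float vTarget, float aMax, float jerk) {
        SCurve s;
        s.j  = jerk;
        s.tj = aMax / jerk;         // time to ramp acceleration from 0 up to aMax
        float vJerk = aMax * s.tj;  // velocity gained across both jerk segments
        if (vJerk <= vTarget) {
            s.apeak = aMax;
            s.ta = (vTarget - vJerk) / aMax;  // constant-acceleration time
        } else {
            // Short ramp: aMax is never reached; shrink the jerk segments instead.
            s.tj = std::sqrt(vTarget / jerk);
            s.apeak = jerk * s.tj;
            s.ta = 0.0f;
        }
        return s;
    }

    // Velocity at time t within the ramp (total duration 2*tj + ta).
    float sCurveVelocity(const SCurve& s, float t) {
        float tEnd = 2.0f * s.tj + s.ta;
        float vEnd = s.apeak * (s.tj + s.ta);              // the target velocity
        if (t < s.tj)        return 0.5f * s.j * t * t;    // jerk up
        if (t < s.tj + s.ta) return 0.5f * s.j * s.tj * s.tj
                                    + s.apeak * (t - s.tj);  // constant accel
        float dt = tEnd - t;                               // jerk down
        return vEnd - 0.5f * s.j * dt * dt;
    }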
I wish I could give you a definite answer; you're just going to have to get a motor and test. Feel free to continue the conversation over email. I'll do my best to help.
The hardware is http://www.makeblock.cc/xy-plotter-robot-kit/ sans electronics. I ended up purchasing the electronics anyways because they have a modified Arduino that uses RJ25, which makes the wiring much more stable (as opposed to pins stuck into an Arduino).
If you source the individual pieces yourself or 3D print them, it can be much cheaper.
The main electrical components are:
1 x Arduino Uno (I used the modified Uno called "Orion" that has RJ25 ports)
2 x Stepper motors
1 x Servo motor
4 x microswitches as limit switches
2 x stepper motor drivers (they handle the microstepping)
3 x RJ25 breakout boards (for connecting the limit switches to RJ25 ports)
This was a prototype to prove the concept. I'm going to experiment with 3D printed parts next in a new design.
If you're feeding in coordinates as Gcode I would recommend loading Grbl. Keep in mind Grbl has specific pin requirements that make it incompatible with the Orion's RJ25 ports, so you'd have to wire a normal Arduino. Makeblock supplies Gcode firmware that works with the Orion.
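Generating Grbl-friendly Gcode from the recorded vectors is the easy part; a minimal sketch (the feed rate and pen up/down handling are placeholders for the actual machine):

    // Turn one recorded stroke into Grbl-style Gcode. The feed rate and the
    // pen up/down commands are placeholders; adapt them to the actual machine.
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    struct Point { float x, y; };

    void emitStroke(const std::vector<Point>& stroke, float feedMmPerMin) {
        if (stroke.empty()) return;
        std::printf("G0 X%.2f Y%.2f\n", stroke[0].x, stroke[0].y);  // rapid to start
        // (pen-down command for your machine goes here)
        for (std::size_t i = 1; i < stroke.size(); ++i)
            std::printf("G1 X%.2f Y%.2f F%.0f\n",
                        stroke[i].x, stroke[i].y, feedMmPerMin);
        // (pen-up command goes here)
    }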
For the demo in the video, the firmware was custom to handle real-time input and recording of motion.
Not sure from the video: does the artist look at the robot when painting, or at a screen? I feel like it's going to be really hard to paint that slowly, so it might be better to simulate the whole thing (which will be way closer to real time) and then reproduce it later on the robot (with some corrections if needed). But simulating might be difficult... or lead the artist to use brush strokes that are really hard to replicate. Cool project anyway! Let us know how it evolves :)
Also just a heads up on the Amanufactory website the font in the Plans section is really hard to read, you might want to check that out ( http://i.imgur.com/Klvmorc.png ).
The artist looks at the robot. They were going extra slow to be cautious, but it can mimic movement pretty fast as long as you're not going across the canvas in a single stroke.
Now you're talking. I can't wait to print out nicely formed brushstrokes from a digital painting.
Something along the lines of embedding a bump map into the 2d image to give subtle depth information. Let me know if anyone needs an artist to do the art side of this.
This is cool. It reminds me of 3D printers, but just in two dimensions and with a paintbrush and paint instead of a hot-end/nozzle/filament.
You could look at the firmware used to control the stepper motors/end stops that much of the reprap community is using at https://github.com/MarlinFirmware/Marlin. This uses GCODE as well, and deals with acceleration around turns.
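For what it's worth, the cornering logic in Grbl boils down to capping speed at each junction between segments based on the turn angle; roughly (my paraphrase of the "junction deviation" heuristic, not the actual planner code):

    // Grbl-style junction deviation (paraphrased): cap the speed through a
    // corner from the angle between segments. theta is the angle between the
    // incoming and outgoing directions: pi for straight-through, 0 for reversal.
    #include <algorithm>
    #include <cmath>

    float junctionSpeed(float theta, float accelMax, float delta, float vMax) {
        float s = std::sin(theta / 2.0f);  // 1 when straight, 0 at a full reversal
        if (s >= 1.0f) return vMax;        // straight line: no slowdown needed
        float v = std::sqrt(accelMax * delta * s / (1.0f - s));
        return std::min(v, vMax);
    }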
Yep looked into Grbl. Hadn't seen Marlin. I didn't use this for Live mode since they weren't really built for that, but they'll be used for the photo to painting part.
Since the strokes are recorded, are they translated into positional coordinates and movements? Is it possible to adjust the "code" post-recording to perfect a replica or add changes? Cool project!