The Modelica language is intended for the modelling and simulation of complex multi-domain systems, but it can also be used for classical controller design and simulation of control systems. This tutorial uses OpenModelica to demonstrate this with transfer function models, step responses and PID controllers from the standard Modelica library.
Example System for Our Controller Design
Consider the system represented by the following transfer function:
Closed Loop Response with no Controller
To do this in OpenModelica we would build the system shown in figure 1 below.
The components we need are the transfer function block (for the plant), the feedback summation and the step source. Their locations in the library are shown in the menu breakdowns in figure 2 below.
Drag between the connection points to make the connections, then double click the components to change their parameters. In this case we change the parameters of the transfer function to match the plant and leave everything else at default. You can right click the transfer function and change its attributes to rename it “plant” instead of “transferFunction1”.
The parameter window for the plant is shown in figure 3 below.
We now set up the simulation to have an end time of 0.05 seconds, then run it.
Now we go to the plotting window and select the ‘y’ variable of our plant, which represents its output. This is the closed loop step response and is shown below.
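For those who prefer working in OMEdit’s text view, the diagram above can also be written as a textual Modelica model. This is only a sketch: the plant’s numerator and denominator coefficients below (b and a) are placeholders, since the actual transfer function is given in the figure; substitute your own before simulating.

```modelica
// Closed loop with no controller: step -> feedback -> plant, output fed back.
// Plant coefficients b (numerator) and a (denominator) are placeholders.
model ClosedLoopNoController
  Modelica.Blocks.Sources.Step step(height = 1, startTime = 0);
  Modelica.Blocks.Math.Feedback feedback;
  Modelica.Blocks.Continuous.TransferFunction plant(b = {1}, a = {1, 1});
equation
  connect(step.y, feedback.u1);
  connect(feedback.y, plant.u);
  connect(plant.y, feedback.u2);
end ClosedLoopNoController;
```

Plotting plant.y after simulating gives the same closed loop step response as the diagram-based model.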
Add a PID Controller
Now we can add the controller. This is a PID controller and can be added from the same menu as the transfer function. We insert the controller in the loop as shown in figure 6.
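For reference, the standard (non-interacting) PID form, with proportional gain K_p, integral time T_i and derivative time T_d acting on the error e(t), is:

```latex
u(t) = K_p \left( e(t) + \frac{1}{T_i} \int_0^t e(\tau)\, d\tau + T_d \, \frac{de(t)}{dt} \right)
```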
The PID controller is represented in standard form, and the parameters we will use are Kp = 0.309, Ti = (0.309/4.5) and Td = (0.0006/0.309); these were determined beforehand by tuning. The step response of the new system is then:
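The loop with the PID inserted can likewise be sketched textually. The PID gains are the ones given above; the plant coefficients are again placeholders for the transfer function shown in the figure.

```modelica
// Closed loop with a standard-form PID between the feedback node and the plant.
// Plant coefficients b and a are placeholders -- use your actual plant's values.
model ClosedLoopPID
  Modelica.Blocks.Sources.Step step(height = 1, startTime = 0);
  Modelica.Blocks.Math.Feedback feedback;
  Modelica.Blocks.Continuous.PID pid(k = 0.309, Ti = 0.309/4.5, Td = 0.0006/0.309);
  Modelica.Blocks.Continuous.TransferFunction plant(b = {1}, a = {1, 1});
equation
  connect(step.y, feedback.u1);
  connect(feedback.y, pid.u);
  connect(pid.y, plant.u);
  connect(plant.y, feedback.u2);
end ClosedLoopPID;
```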
With this basic setup you can experiment with different plants, controller parameters and control laws as you carry out your controller design.
Don’t forget to share with interested friends and colleagues. Happy coding 🙂
Speaking at the Box developer conference on April 22, 2015, Eric Schmidt revealed that Google has a medium-term strategy to make tangible returns from the robotics technology it has acquired and continues to develop, and that it expects to make these returns in the next 5 to 10 years.
As noted in this article, he didn’t have much to say, but he did reveal something of Google’s economic view of robot technology. He also highlighted, while talking about Google’s role in bringing the self-driving car to reality, that he thinks much of the technology used in robots outperforms the comparable human senses, and that robots may be more capable than humans at performing some tasks.
Maybe we are reading too much into his comments, but they do seem to give some insight into how Google might approach robotic development in the future: focus on tasks that are real social problems (such as drunk driving) or economic problems (such as high labor costs) and develop robots to solve them. Necessity is the mother of invention, and if the invention is necessary for most people it will be very valuable economically.
Share this article with your friends and keep an eye out as we continue to see how robots will integrate (or not) into society.
Mobileye earlier this year announced the EyeQ4 system on a chip (SOC), their 4th generation camera-based Advanced Driver Assistance Systems (ADAS) SOC for the automotive industry. The EyeQ4 has huge amounts of computing power and consists of 14 computing cores out of which 10 are specialized vector accelerators with extremely high utilization for visual processing and understanding. All this is necessary for it to analyze the huge amounts of data pouring from the cameras and sensors on self-driving cars and do what is required to keep these autonomous vehicles driving safely and efficiently.
For all this computing muscle the EyeQ4 uses only 3 watts of power, less than many mobile phone application processors. The EyeQ4 also supports sensor fusion with radars and scanning-beam lasers if they are present on the car. This adds to Mobileye’s list of products that have been used by major automakers (BMW, Volvo, etc.) for years for such things as lane departure warning and speed limit indication.
This kind of technology may be the way to go for general robot design (yes, self-driving cars are robots). We as humans rely primarily on our eyes for the data we use for general navigation, so maybe our robots should as well.
For this we do have the Kinect, Microsoft’s motion sensing platform originally intended for Xbox gaming. The sensor has since been used as a general visual input for various applications, including robotics. Check out the videos below.
What we may need, however, is a genuinely low cost open hardware solution so the tech can become more widespread and accessible to more people. This could help advance its application to robotics by fueling hobbyist and undergraduate use of this advanced technology.
We may have a start by using the Raspberry Pi and the NoIR camera. Let’s see where this goes.
What we definitely need are lower-cost navigation sensors so our hobby robots can have more intelligence and hence be more useful.
Line tracking is probably the most widely used navigation system for robots, especially in student-level robot competitions. This robot, though, will most likely shock all those students with its sheer speed.
Check out this hyper fast line following robot and see if you and your friends can build a robot to match its speed.
Line tracking requires a contrasting line on the floor and therefore is not ideal for general robot navigation, but it is a good way to learn the basics of robot navigation, and those basics can be applied to more complex technologies. Happy coding.
This video shows how to make a simple BEAM walking robot. It is a very good introduction to the technology; maybe you can advance it and bring it to the mainstream.
Try this if you want to explore BEAM technology. If you are an advanced builder you could try to mix this BEAM technology with regular digital robotics technology.
Share with your friends using the buttons below and consider building them together with parts from Amazon; follow the links below and support us.
This is a walking robot powered by a single motor that really resembles an insect; more nature-inspired stuff. The design reminds me of BEAM (from Biology, Electronics, Aesthetics and Mechanics) robots, which use a kind of analog system (instead of the common digital systems) modeled on the way our nervous systems work, and which are commonly used to make insect robots like this one.
This BEAM design concept can give a robot relatively advanced behavior with simple components and no programming. I came across the BEAM concept during my first love affair with robots, and now I think I may have to go and re-research the tech and see if it has been advanced or taken up by any of the big boys.
I really liked the concept because it seemed like something that a futuristic artificial intelligence would do to create little helper drones, at least the navigation and basic ‘reflex’ behavior for these drones. From what I’m remembering now, using the BEAM concept would allow the ‘nervous system’ of these robots to be made from relatively simple components and require no programming at all. More info to come so stay tuned.
Share this post with your friends using the buttons below to make them aware of this alternate robot technology ;).
This is an ice cream stick robot:
This instructional video will show you how to make it. Very cool; a good weekend project, perhaps? It can also serve as a reasonable introduction to electronics and robotics technology.
Buy your parts on Amazon through the links below, share with your friends and make them together; maybe even have a race 🙂
Don’t forget to share this with your friends and followers using the buttons below and you can check out the very very cool Lego Mindstorms kit below, you can also view similar posts for more cool robot stuff.
BionicANTs, developed by Festo to showcase their research and development efforts, leave no doubt as to where their design inspiration comes from: ants.
These robotic insects are more than just giant versions of real-world ants.
Since being acquired by Google, the Boston Dynamics team has advanced their world-famous robot mule. The most noticeable change must be the power source: it now appears to be fully battery powered, as the whine and smoke of the engine are gone.
This technical advance again illustrates the remarkable strides we are making with robot development and I would really like to have a look at the systems in this robot. The engineering designs should be awesome. We want open source 🙂
Spread the word of this tech advance by sharing this post with your friends using the buttons and check out cool robots on Amazon using the links below.