[CNIT 581-SDR / Spring 2019] Week 16 – Final

These days, many people use smart devices and can easily paint with them. To make drawing more convenient, we enabled robots to draw pictures through an electronic device application. We designed a new robot, developed an application, and controlled the robot by following the coordinates obtained from drawings made in the application.

A. System Architecture

To implement an interactive robot drawing system, we set up a mobile robot system that consists of two parts. First, a user draws a line or a shape in a mobile application and selects the size of the drawing. The drawn trajectory is stored as a list of tuples and saved into a JSON file so that the mobile application and the robot system can share the user's drawing. Considering a robotic system that can draw a picture anywhere, the Hamster robot platform was selected. The physical robot system, consisting of two mini robots, reads this data and calculates a new path in physical space for the robot to draw. The mobile robot system receives the calculated trace through Bluetooth connections so that the robots can move. We also use simple socket communication to access data created on the mobile device beyond the restricted sandbox of the iOS application.

Design of Robot

We designed our robot by moving two robots together and attaching the pen at the center between them. One robot could be used instead of two if the pen could be held at the center of the robot's wheels; otherwise, the robot leaves a stray trajectory while turning right or left because the pen is not located at the center of the wheel axis. Two mini mobile robots with a pen attached between the wheels can easily sketch anything the user draws on the mobile device. Instead of integrating a servo motor to control the pen, we used a 3D-printed bracket to hold the pen for simplicity. The complication of using a mini servo motor with the Hamster mobile robot is that the servo needs 4.8 V to activate, while the Hamster robot's battery supplies 3.7 V. In addition, the mobile robot supports only two ports, whereas the mini servo motor uses three. Controlling the Pulse Width Modulation (PWM) through the Hamster mobile robot with this port limitation makes it difficult to tune the correct angle for the servo motor. The initial bracket design was loose and affected both the movement of the robot wheels and the pen's drawing direction. Different 3D brackets were designed and tested to hold both Hamster robots, taking into account the tilted outer wheels and the correct hole size to hold the pen without deforming the Polylactic Acid (PLA) bracket.

We built a simple iOS application that lets the user draw lines on the display. There is a white drawing space with five buttons, as Fig. 3 shows. The three buttons below the white drawing area set the picture size. When the user touches the display to draw, black lines appear following the user's gestures. When the user finishes and taps the 'Send' button, the stacked pixel coordinates become one of the components of a JSON file. The JSON structure was chosen because the robot drawing system requires sizing and coordinate information together, and this file structure lets the programmer easily access the desired data by variable name.

We found a significant difference between our expectation and the actual structure of the coordinates. The pixel system does not contain every coordinate along the user's input. Instead, there are gaps between coordinates, which form a set of Bezier path segments. Each sub-path has a start point and a motion point, and the spacing is affected by the angle of the path the user has drawn: the greater the change in curve angle, the narrower the gap between sub-paths becomes. Since our robot system calculates the turning angle between the current coordinate and the new target position, sub-paths with very narrow gaps can create a large burden. Therefore, while cleaning markers such as the 'move to' and 'quad to' flags from the Bezier path, we have been working to simplify the coordinate values at regular intervals.
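As an illustration of this cleaning step, here is a minimal sketch (Python, robot-system side) of resampling a stroke's coordinate list at a fixed interval; the function name and the interval value are our own assumptions for illustration, not the exact code used in our system.

```python
import math

def resample(points, step=5.0):
    """Resample a list of (x, y) points so consecutive points are
    roughly `step` pixels apart. Very close points are skipped, and
    long gaps are filled by linear interpolation."""
    if not points:
        return []
    out = [points[0]]
    for x, y in points[1:]:
        px, py = out[-1]
        dist = math.hypot(x - px, y - py)
        if dist < step:
            continue                      # drop points that are too close together
        n = int(dist // step)
        for i in range(1, n + 1):         # insert evenly spaced points along the gap
            t = i * step / dist
            out.append((px + t * (x - px), py + t * (y - py)))
    return out

# Example: a stroke with uneven spacing becomes roughly uniform.
stroke = [(0, 0), (1, 0), (2, 0), (30, 0)]
print(resample(stroke, step=5.0))
```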

There was an issue of data accessibility when transmitting the JSON data to our robot system. We could not read the structured file from another device because the iOS environment does not allow an application to access system resources outside its own sandbox. As alternatives, there were two possible solutions: socket communication and a representational state transfer application program interface (RESTful API). We chose socket communication because our project runs in a small area and we use a local laptop for the Bluetooth communication, so creating a RESTful API with HTTP requests would be redundant. Therefore, we used the local Wi-Fi environment to transmit the JSON file from the tablet to the local device running the Python-based robot system, opening a port that uses the Transmission Control Protocol (TCP).
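For reference, a minimal sketch of the laptop-side TCP receiver in Python is shown below; the port number, buffer size, and file name are assumptions for illustration, not the exact values used in our system.

```python
import json
import socket

HOST, PORT = "0.0.0.0", 9999                # assumed port; listen on all local interfaces

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
    server.bind((HOST, PORT))
    server.listen(1)
    conn, addr = server.accept()            # wait for the tablet to connect over Wi-Fi
    with conn:
        chunks = []
        while True:
            data = conn.recv(4096)          # read until the sender closes the socket
            if not data:
                break
            chunks.append(data)

drawing = json.loads(b"".join(chunks).decode("utf-8"))
with open("drawing.json", "w") as f:        # hand the file over to the robot controller
    json.dump(drawing, f)
```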

The movement of the robot is divided into two steps: changing its heading and moving forward along the coordinates. The control system reads the series of drawing coordinates stored in the JSON file sent from the application. Given the current coordinates and the target coordinates, the robot computes the heading angle and the distance to move. After calculating them, the robot turns to the new heading and moves the given distance toward the target coordinates.

θ = arctan(y′ − y, x′ − x)

The heading angle is calculated with the two-argument arctangent (atan2). The coordinate (x′, y′) is the target position and (x, y) is the current position. The turning angle the robot should make is the difference between the current heading angle and the new heading angle. The pair of robots turns in place by this turning angle, converted from radians to degrees. After calculating the turning angle, the robot decides the turning direction: it turns right if the angle is in [0, 180] and left if the angle is in [-180, 0].

dI : dO = vrI : vrO
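As a rough sketch of this step (Python, with hypothetical helper names), the turning angle and direction could be computed as follows; the actual control code may differ.

```python
import math

def turn_command(current, target, heading_deg):
    """Return (direction, degrees) to rotate from the current heading
    toward the target coordinate. `heading_deg` is the robot's current
    heading in degrees; coordinates are (x, y) tuples."""
    x, y = current
    tx, ty = target
    new_heading = math.degrees(math.atan2(ty - y, tx - x))
    turn = new_heading - heading_deg
    # Normalize into (-180, 180] so the robot always takes the shorter turn.
    while turn > 180:
        turn -= 360
    while turn <= -180:
        turn += 360
    direction = "right" if turn >= 0 else "left"
    return direction, abs(turn)

# Example: facing 0 degrees at (0, 0), moving toward (1, 1) needs a 45-degree turn.
print(turn_command((0, 0), (1, 1), 0.0))    # ('right', 45.0)
```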

vrI = −vlI, vrO = −vlO

The two robots should spin in opposite directions, and the outer wheels should run faster than the inner wheels. The outer wheels are 3.8 times farther from the center than the inner wheels, so the velocity of the outer wheels is 3.8 times the velocity of the inner wheels. Here dI is the distance between the center and an inner wheel and dO is the distance between the center and an outer wheel; vrI and vlI are the inner-wheel velocities of the two robots, and vrO and vlO are the outer-wheel velocities. Once the robot faces the target position, it is ready to move.

A PID controller [10] is needed to make the robot move precisely. The robot cannot go straight on its own because the two motors can have different winding resistances, and therefore different drive currents and torques, and the floor can have varying surface friction. Our robot drifts toward the right side, so we apply a compensation to the right wheel to make it move straight. We calculate the compensation for the right wheel with a PD controller that reduces the acceleration error. We tuned the movement with a PD controller instead of a full PID controller because PD control keeps the robot's motion smooth and stable.

Vc = e ∗ Kp + (e − e′) ∗ Kd

Here e is the difference between the current acceleration and the target acceleration needed to move straight, and e′ is the previous acceleration error. The velocity compensation Vc is the sum of e multiplied by the proportional gain constant Kp and the error difference (e − e′) multiplied by the derivative gain constant Kd. The constants Kp and Kd were set by experiment. The robots drive straight for a time proportional to the distance between two coordinates, and the size of the drawing can be changed by changing the ratio between moving time and distance.
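A minimal sketch of this PD compensation loop in Python is shown below. The gains match the values reported in our Week 14 tuning, while the sensor call and wheel-speed function names are placeholders for illustration, not the Hamster library's actual interface.

```python
def pd_compensation(error, prev_error, kp=0.01, kd=0.02):
    """Vc = e*Kp + (e - e')*Kd: velocity compensation applied to the
    right wheel so the robot drives straight."""
    return error * kp + (error - prev_error) * kd

def drive_straight(read_accel_error, set_wheels, base_speed=30, steps=100):
    """Hypothetical control loop: read_accel_error() returns the current
    acceleration error, set_wheels(left, right) sets the wheel speeds."""
    prev_error = 0.0
    for _ in range(steps):
        error = read_accel_error()
        vc = pd_compensation(error, prev_error)
        set_wheels(base_speed, base_speed + vc)   # compensate only the right wheel
        prev_error = error
```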


Demo Video: https://www.youtube.com/watch?v=TtvmhD8yqoQ&feature=youtu.be

[CNIT 581-SDR / Spring 2019] Week 14 – Navigation Control with PID

First of all, we tried to control the robot by using PID control.

Figure 1: PID Controller

A PID controller (proportional-integral-derivative controller) is a control-loop feedback mechanism. Within PID control, we can use different subsets of terms depending on the factors we need: all of the P, I, and D constants, or only the P and D constants, to control the movement. When we tried to move the Hamster robot with a simple move_forward() command, it did not go straight; it drifted to the right even with no steering input. Thus, we tried to remove that error by driving the acceleration error close to 0. For movement and heading-angle control, our team tried to decrease the error (the fluctuating movement) by applying PD control. The P and D constants are tuned heuristically, so we increased them in steps of 0.001. We found that a P constant of 0.01 and a D constant of 0.02 gave the best result in our circumstances (friction, pen, surface, etc.). However, results still varied: when the Hamster robot's battery is below a full charge, the wheel speed slows down, and the pen holder, attached with tape, can shift because it is unstable. In addition, because of noisy sensors, the results were not always consistent.

Figure 2: Drawing straight lines
Figure 3: Trial to draw a rectangle

Because of these inconsistent results, we changed the method to apply a compensation to the right wheel and map the angles directly. This produced much more accurate results than the PID control. We then tried to draw a rectangle; however, the robot drew a rounded trajectory when it turned to change its angle. The two solutions we found were to lift the pen while changing the angle, or to place the pen at the center of the robot. We chose the second option. Since we cannot drill through the robot, we decided to use two robots and put the pen at the center between them.

[CNIT 581-SDR / Spring 2019] Week 12 – Navigation Control from coordinate

 

In the previous blog, we decided to send a JSON file to control the motors of the Hamster robot. However, we found that the drawing data structure in the mobile environment is entirely different from what we originally thought. When a user draws lines on the display, the object that receives the user's input stores only the starting and ending coordinates of each segment together with curve information.

(https://developer.apple.com/documentation/uikit/uibezierpath?language=objc)

When we inspected the coordinates of a drawing, they contain not only the starting and ending points but also 'quadto' information, which tells us that the lines have a geometric (curved) shape. The picture below shows a JSON file containing the coordinates of two paths.

[Screenshot: JSON file containing the coordinates of two paths (updated 04/06/2019)]
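Since a 'quadto' segment only stores a control point and an end point, intermediate coordinates have to be generated if the robot needs a denser path. Below is a minimal sketch of evaluating a quadratic Bezier segment in Python; this is the standard formula, not code taken from our system.

```python
def quad_bezier(p0, p1, p2, samples=10):
    """Sample a quadratic Bezier curve: p0 is the start point, p1 the
    'quadto' control point, p2 the end point. Returns (samples + 1) points."""
    points = []
    for i in range(samples + 1):
        t = i / samples
        x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
        y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
        points.append((x, y))
    return points

# Example: a gentle curve from (0, 0) to (10, 0) bending toward (5, 5).
print(quad_bezier((0, 0), (5, 5), (10, 0), samples=4))
```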

 

Therefore, we referred to a blog post explaining how robots turn. To turn right or left accurately, we need to calculate the angular velocity of the left wheel and the angular velocity of the right wheel. We need to control the robot's position in the x and y coordinates as well as its orientation.

Same speed on both wheels: the robot goes straight.
One wheel slower: the robot turns toward that side.

[Figure: kinematic model of a differential-drive robot]

(http://enesbot.me/kinematic-model-of-a-differential-drive-robot.html)

After reading it, we realized that the second assignment of the SDR class is similar to our purpose, so we are now examining how to implement the motor control using these two references.
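As a starting point, here is a minimal sketch (Python) of the differential-drive inverse kinematics from the linked model: it computes left and right wheel speeds from a desired forward velocity and turning rate. The wheel radius and axle length are assumed example values, not measurements from the Hamster robot.

```python
def wheel_speeds(v, omega, wheel_radius=0.02, axle_length=0.04):
    """Differential-drive inverse kinematics.
    v     : desired forward velocity of the robot center (m/s)
    omega : desired turning rate (rad/s), positive = counter-clockwise
    Returns (left, right) wheel angular velocities in rad/s."""
    v_right = v + omega * axle_length / 2.0   # outer wheel moves faster in a turn
    v_left = v - omega * axle_length / 2.0
    return v_left / wheel_radius, v_right / wheel_radius

# Same speed on both wheels -> straight; slower left wheel -> turn left.
print(wheel_speeds(0.1, 0.0))   # straight line
print(wheel_speeds(0.1, 1.0))   # gentle left turn
```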

 

 

[CNIT 581-SDR / Spring 2019] Week 10 – Mobile Application

 

For the mobile side, we've built an application in which users can draw lines and choose the line width.

[Screenshot: drawing application with the drawing area and thickness buttons]

The white square at the center of the application is the drawing area, which receives the user's input; black lines are drawn along with the input. The blue buttons at the bottom set the thickness of the lines so the user can simply select the desired one. We also added a reset button in case the user wants to redraw.

 

[Screenshot: drawing application with the 'Send' button]

The ‘Send’ button at the bottom sends the trajectory information to the physical robot. The data uses coordinates relative to a starting point rather than absolute coordinates. The drawing information is saved as JSON, as shown in the example below.

{"item_number" : 3,
 "infos": [
  {
   "starting_point" : [149.0, 30.8],
   "thickness" : 1,
   "coordinates" : [[A,B], [C,D], [E,F], ...]
  },
  {
   "starting_point" : [167.0, 20.3],
   "thickness" : 2,
   "coordinates" : [[A',B'], [C',D'], [E',F'], ...]
  },
  {
   "starting_point" : [195.0, 16.2],
   "thickness" : 3,
   "coordinates" : [[A'',B''], [C'',D''], [E'',F''], ...]
  }
 ]
}

Depending on the receiving side, the data structure and individual items may change.
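On the robot side, the file could be consumed roughly as follows; this is a minimal sketch in Python assuming the structure above (including the assumption that the stored coordinates are offsets from the starting point), not the final parser.

```python
import json

# Load the drawing file sent from the mobile application (assumed file name).
with open("drawing.json") as f:
    drawing = json.load(f)

print("number of strokes:", drawing["item_number"])
for stroke in drawing["infos"]:
    start = stroke["starting_point"]          # starting point [x, y]
    thickness = stroke["thickness"]           # selected line width (1-3)
    # Convert the relative coordinates back to absolute pixel positions.
    path = [(start[0] + dx, start[1] + dy) for dx, dy in stroke["coordinates"]]
    print(f"stroke with thickness {thickness}: {len(path)} points")
```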

 

[CNIT 581-SDR / Spring 2019] Week 8 – Automated Drawing Robots

We wrote the proposal report and gave a presentation. The following is a summary of the proposal for our project.

Abstract—Although many areas of human physical labor have been replaced by advanced technologies, drawing has not been fully taken over by robots or electronic devices. People are still exposed to dangerous environments, such as paint toxicity or traffic accidents, when they paint road signs. Thus, this proposal proposes replacing such drawing work with robots controlled by mobile devices.

Literature Reviews:

A. The Artist Robot: A robot drawing like a human artist

According to G. Jean-Pierre and Z. Said [1], the artist robot is designed to draw portraits and behave like a human artist using a 6-DOF industrial manipulator.

B. Watercolour Robotic Painting : a Novel Automatic System for Artistic Rendering

Scalera et al. [2] developed an automatic watercolor painting system in which the robot Busker renders an image and paints it in watercolor using different brush techniques.

C. Autonomous Indoor Robot Navigation Using Sketched Maps and Routes

In 2016, Boniardi et al. [3] developed a system that uses a hand-drawn indoor map and a trajectory sketched on a mobile device for a robot to follow.

Methodology:

First of all, users draw a trace on the mobile application's canvas. The application stores the input trajectory and lets users select the thickness of their drawing, which represents its brightness. The robot, in turn, must be able to adjust its motor angles to follow the given trajectory. While moving, a motor holding the drawing material can adjust the contact between the pen tip and the drawing surface. The mobile application and the robot are connected through a computing side that converts coordinates to account for the scale difference between the mobile display and the physical environment.
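As an illustration of that scale conversion, a minimal sketch in Python is shown below; the display and paper dimensions are made-up example values, not measurements from our setup.

```python
def scale_point(px, py, display_size=(768.0, 768.0), paper_size=(30.0, 30.0)):
    """Map a pixel coordinate from the mobile drawing area to centimeters
    on the physical drawing surface. Sizes are assumed example values."""
    sx = paper_size[0] / display_size[0]
    sy = paper_size[1] / display_size[1]
    return px * sx, py * sy

# A point in the middle of the display maps to the middle of the paper.
print(scale_point(384.0, 384.0))   # (15.0, 15.0)
```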

Equipment List:
Mobile Device, Hamster Robot, ROS

http://hamster.school/en/reference/

Project Schedule:

[Figure: schedule of this project]

References:

[1] G. Jean-Pierre and Z. Said, "The artist robot: A robot drawing like a human artist," in Proceedings of the 2012 IEEE International Conference on Industrial Technology (ICIT), 2012.

[2] L. Scalera, S. Seriani, A. Gasparetto, and P. Gallina, "Watercolour Robotic Painting: a Novel Automatic System for Artistic Rendering," 2018.

[3] F. Boniardi, A. Valada, W. Burgard, and G. D. Tipaldi, "Autonomous Indoor Robot Navigation Using Sketched Maps and Routes," Tech. Rep.
