This paper is the continuation of work presented at ICORR 2007, in which we discussed the possibility of improving eye-hand coordination in children diagnosed with poor graphomotor skills by using a robotic mapping from a haptic user interface to a virtual environment. Our goal is to develop, implement, and refine a system that assesses and improves eye-hand coordination and grip strength in these children. A detailed analysis of patterns (e.g., labyrinths, letters, and angles) was conducted in order to select three clearly distinguishable levels of difficulty for inclusion in the system, chosen to yield the greatest benefit both in assessing coordination and strength deficits and in training. Assistance algorithms (position, force, velocity, inertia, and viscosity) were also developed and incorporated into the tasks to provide general computer assistance in mapping the user's movements to the computer screen without overriding the user's commands to the robotic device. To evaluate the performance of the executed tasks, measured by percent accuracy and completion time, an evaluation function was designed based on image analysis and edge-detection algorithms. This paper presents the development of the haptic tasks, the various assistance algorithms, the description of the evaluation function, and the results of a study conducted at the Motor Development Clinic at Cal Poly Pomona. The accuracy and time outputs of this function are currently used as inputs to an Intelligent Decision Support System (described in [5]), which in turn suggests the next task to be executed by the subject based on his or her performance.
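
To illustrate the general form such assistance can take, the sketch below blends position, viscosity, and inertia terms into a single bounded assistive force. It is a minimal illustration only: the gains, saturation limit, and function name are placeholders chosen for readability, not the values or implementation used in the system described here.

```python
import numpy as np

def assistance_force(pos, vel, accel, target,
                     k_p=50.0, b=2.0, m=0.1, f_max=1.5):
    """Blend position, viscosity, and inertia terms into a bounded assistive force.

    pos, vel, accel, target: 2-D numpy arrays in device workspace coordinates.
    Gains k_p (N/m), b (N*s/m), m (kg) and the limit f_max (N) are illustrative.
    """
    f_spring = k_p * (target - pos)   # gentle pull toward the target trace
    f_viscous = -b * vel              # viscosity: damp fast, jerky motion
    f_inertia = -m * accel            # inertia: smooth abrupt accelerations
    f = f_spring + f_viscous + f_inertia

    # Saturate the assistance so it can guide, but never override,
    # the user's own commands to the robotic device.
    norm = np.linalg.norm(f)
    if norm > f_max:
        f = f * (f_max / norm)
    return f
```

The saturation step reflects the design goal stated above: assistance shapes the mapping of the user's movements but the user's input remains dominant.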
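
The following sketch shows one way an accuracy score of this kind could be computed from images, assuming the template pattern and the subject's trace are available as grayscale images. The tolerance band, Canny thresholds, and function names are assumptions for illustration and are not the paper's actual evaluation function.

```python
import cv2
import numpy as np

def evaluate_trace(template_img, drawn_img, tol_px=3, elapsed_s=None):
    """Score a completed trace against its template pattern.

    template_img, drawn_img: grayscale images of the pattern and the trace.
    tol_px: hypothetical tolerance band (in pixels) around the ideal path.
    Returns percent accuracy and echoes the elapsed time, if given.
    """
    # Extract the pattern outline with an edge detector (Canny used here).
    edges = cv2.Canny(template_img, 50, 150)

    # Widen the outline into a tolerance band around the ideal path.
    kernel = np.ones((2 * tol_px + 1, 2 * tol_px + 1), np.uint8)
    band = cv2.dilate(edges, kernel)

    # Binarize the subject's trace and count how much of it stays on the path.
    trace = (drawn_img > 0).astype(np.uint8)
    on_path = np.count_nonzero(trace & (band > 0))
    total = np.count_nonzero(trace)

    accuracy = 100.0 * on_path / total if total else 0.0
    return accuracy, elapsed_s
```

Accuracy and time computed in this manner are the kind of outputs that feed the Intelligent Decision Support System referenced above.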