Introduction:
A robotic interface, simply put, is a gateway between a person and a robotic product. These interfaces range from something as simple as an automated voice system over the telephone to something as sophisticated as an actual human-like robot.
Humanoid Interfaces
An entertainment robot like Sony's QRIO provides various interfaces for HCI, such as voice, touch and vision. This allows it to interact with humans in a natural way (for example, a person might "order" QRIO to salute by voice). The following video of QRIO talking with kids demonstrates its highly humanoid interface:
Importance of Reliable Robotic Interface Design
When it comes to industrial and military applications, user enjoyment becomes far less important. In these systems, safety, reliability and efficiency are paramount, and robotic interface designers have to be extremely cautious. The well-known Silicon Valley CX30 "killer robot" case shows the importance of correct robotic interface design.
Robotic Interfaces Used in Daily Life
Robotic devices have also been used to help severely disabled persons. The physicist Stephen Hawking has been able to move and speak, despite being almost completely paralyzed by amyotrophic lateral sclerosis (ALS), with the help of his specially designed wheelchair, which is essentially a robot.
Hawking now relies on his cheek to control the wheelchair and its mounted voice synthesizer, which is far from efficient: during a TED conference talk, it took him seven minutes to answer a question.
A Possible Future of Robotic Interfaces
One research direction for robotic interfaces is brain-robot interfaces, which aim to let humans interact with robotic products directly via brain waves. This would allow intuitive interaction with and manipulation of robotic products. Although still far off, such technology may become reality someday and completely change our everyday life.
Conclusion
Robotic interfaces, seemingly remote, are already part of our lives. Widely used in industrial, commercial and medical applications, they are slowly changing the way we live and perceive the world. The future of robotic interfaces will focus on natural (humanoid) and intuitive interaction, realized through lower-level human functions such as voice, gesture and even brain waves.
References:
Robotic Interface, http://www2.mse.vt.edu/inamm/RoboticInterface/tabid/864/Default.aspx
Stephen Hawking, http://en.wikipedia.org/wiki/Stephen_Hawking
The Future of User Interfaces, http://sixrevisions.com/user-interface/the-future-of-user-interfaces/
The 'Killer Robot' Interface, Horace Gritty, http://ethics.csc.ncsu.edu/risks/safety/killer_robot/killer_news5.html
Wednesday, 21 September 2011
Saturday, 17 September 2011
Team 18 Project Proposal
Introduction:
There are many different kinds of events at NUS every day, such as career talks and student activities. To organize such an event, organizers usually need to:
- select a date and book the venue
- design the event flow
- send invitation emails to potential participants
- gather responses from the targeted participant group
The purpose of our project is to design a web-based application that eases the flow of event organization at NUS, from date and venue selection to event registration. With the help of our application, an event organizer will be able to arrange an event simply by clicking a mouse in front of a computer.
Literature Review:
Several similar event management applications have already been implemented, and an NUS event could certainly be organized using them. However, those applications have two main drawbacks. Firstly, most of them have a very complex user interface. The picture below is a screenshot of one such event application.
Although reasonably well designed, it contains unnecessary options that make users tired of reading. Secondly, none of these applications is specific to NUS. Our goal is therefore to make an application tailored to events at NUS: to remove unnecessary functions and excessive complexity so as to help users achieve higher efficiency.
Functionalities with sketches of interface:
Venue & Date selection and Booking:
Overall, we are going to provide users with four functions: venue & date selection and booking, program design, budget management, and invitation management. The four steps are listed on the left side of all our interfaces, so users immediately see the flow of the application, and they can go to any step by clicking one of these four buttons, which gives them maximum freedom.
The first function is venue & date selection and booking. Venues are grouped by faculty. This is a reasonable way of grouping, since different kinds of activities tend to take place in different faculties: for example, industry talks are most likely held at the Faculty of Engineering, while science seminars take place at the Faculty of Science. Once the user has chosen a venue, its information is displayed, such as location, capacity and facilities.
Date selection is implemented as a calendar. Users make their choice simply by clicking on a day in the calendar; once a date is selected, the available time slots for that day are displayed.
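As a rough sketch of how the time-slot display might be computed on the server (the function name, the hourly granularity and the `Booking` shape are our own illustrative assumptions, not a finalized design), the free slots for a chosen venue and date are simply the venue's opening hours minus its existing bookings:

```typescript
// Illustrative sketch: compute free time slots for a venue on a chosen date.
// Hourly slots, the opening hours and the Booking shape are assumptions.
interface Booking {
  startHour: number; // inclusive, 24-hour clock
  endHour: number;   // exclusive
}

function availableSlots(
  bookings: Booking[],
  openHour = 8,
  closeHour = 22,
): number[] {
  const free: number[] = [];
  for (let h = openHour; h < closeHour; h++) {
    // the one-hour slot starting at h is free if no booking covers it
    const taken = bookings.some(b => h >= b.startHour && h < b.endHour);
    if (!taken) free.push(h);
  }
  return free;
}

// Example: the venue is booked 9-12 and 14-16 on the selected date,
// so the calendar would offer the remaining hours between 8 and 22.
const free = availableSlots([
  { startHour: 9, endHour: 12 },
  { startHour: 14, endHour: 16 },
]);
```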
Program Design:
The second function is program design. Some programs, such as lunch and the closing speech, are displayed in the "table" by default. Users can drag and drop these programs into the event's timetable on the right side. Of course, users can add more activities to the "table" by clicking the plus symbol at its top corner.
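The drop step above needs one piece of logic: a dropped program should only be accepted if it does not overlap an item already on the timetable. A minimal sketch, assuming a simple minute-based `ProgramItem` shape of our own invention:

```typescript
// Illustrative sketch of the drag-and-drop validation: a drop succeeds
// only if the item does not overlap an existing timetable entry.
interface ProgramItem {
  name: string;
  start: number; // minutes from midnight
  end: number;   // exclusive
}

function tryAddItem(timetable: ProgramItem[], item: ProgramItem): boolean {
  // two intervals overlap iff each starts before the other ends
  const overlaps = timetable.some(
    t => item.start < t.end && t.start < item.end,
  );
  if (overlaps) return false; // reject the drop, keep the timetable valid
  timetable.push(item);
  return true;
}
```

For example, after dropping "Lunch" at 12:00-13:00, a second item dropped at 12:30 would be rejected rather than silently double-booking the slot.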
Budget Management:
The third function is budget management. Users can enter the description of an item (including its name and cost) and add it to the expenses table on the right side, where all expenses are summarized. Users can also generate a budget report with our application (not shown in the sketch above).
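The summary and report steps are straightforward to sketch (the `Expense` shape and the plain-text report format are our own placeholders; the real report would be formatted more carefully):

```typescript
// Illustrative sketch of the expenses summary and budget report.
interface Expense {
  name: string;
  cost: number; // in dollars
}

function totalCost(expenses: Expense[]): number {
  return expenses.reduce((sum, e) => sum + e.cost, 0);
}

function budgetReport(expenses: Expense[]): string {
  const lines = expenses.map(e => `${e.name}: $${e.cost.toFixed(2)}`);
  lines.push(`Total: $${totalCost(expenses).toFixed(2)}`);
  return lines.join("\n");
}
```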
Invitation Management:
The last function is invitation management. Sending invitation emails is integrated into our application. If the user needs to collect RSVPs from potential participants, he can click the Request RSVP button; the RSVP form itself can also be designed within our application.
Users can add the fields they need to the RSVP form, and a preview of the form is displayed on the right side of the screen.
When potential participants reply, our application is notified. The organizer can then view RSVP statistics in the application, and can also export the statistics to Excel.
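The statistics and export steps could look something like the following sketch (the `RsvpReply` fields are assumed for illustration; a real export would include whatever fields the organizer put on the form, and the CSV output is just one simple way to feed Excel):

```typescript
// Illustrative sketch: tally RSVP replies and flatten them into CSV rows
// that Excel can open. The reply fields are our own assumptions.
interface RsvpReply {
  email: string;
  attending: boolean;
}

function rsvpStats(replies: RsvpReply[]): { yes: number; no: number } {
  let yes = 0;
  for (const r of replies) if (r.attending) yes++;
  return { yes, no: replies.length - yes };
}

function toCsv(replies: RsvpReply[]): string {
  const header = "email,attending";
  const rows = replies.map(r => `${r.email},${r.attending ? "yes" : "no"}`);
  return [header, ...rows].join("\n");
}
```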
Application on NUS officer side:
Once an application has been made, our system generates an email to the NUS officer side. The officer can then log in to our system as a facilitator to accept or decline the application. Furthermore, there will be a dedicated interface for facilitators to view and edit bookings of the facilities under their management.
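The approval flow above amounts to a small state machine: an application starts as pending, a facilitator moves it to accepted or declined, and an already-decided application cannot be decided again. A sketch (the status names are our assumption):

```typescript
// Illustrative sketch of the officer-side approval flow.
// A booking application may only be decided once.
type Status = "pending" | "accepted" | "declined";

function decide(current: Status, accept: boolean): Status {
  if (current !== "pending") {
    throw new Error("application already decided");
  }
  return accept ? "accepted" : "declined";
}
```

Modeling the transition this way means the UI can simply disable the accept/decline buttons whenever the status is no longer "pending".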
Assumptions:
We assume that NUS will provide us with an API to access NUS databases, such as the NUS email and venue-availability databases.

Strength of each team member:
Christine Lisun: Interface design.
Xu Yecheng: Programming ability.
Xiang Yongzhou: Good teamwork ability.
Huang Cheng: Programming ability.
Our strengths lie in different fields. We can help each other in this project and make it a successful application.
Friday, 2 September 2011
Interaction in wayFinding applications
Introduction
In this blog post, we discuss different kinds of interaction in wayFinding applications.
1. Overview
The term interaction comes from human-computer interaction. Rather than defining interaction formally, the important thing to note is that the user interacts with the computer in order to accomplish something (http://mysite.verizon.net/resnx4g7/PCD/WhatIsInteractionDesign.html).
In a wayFinding application, the user's goal is to reach his or her destination. A significant number of wayFinding applications are currently available to us, for example Google Maps, AR Navi NUS and Wikitude Drive (augmented reality). We will assess each of these applications in terms of user interaction.
2. Google Maps
The most widely used wayFinding application on the internet and mobile devices is Google Maps. Its interface is simple and user friendly: users can view the map of a whole city and see their exact location in it. Simply key in the start point and end point you want, and Google Maps will highlight a suggested route for you. Furthermore, Google Maps can also show nearby restaurants and gas stations, which is very convenient.

3. AR Navi NUS
AR Navi NUS is a new indoor wayFinding application at NUS that uses augmented reality (AR) technology to provide a better user interface. It shows a 3D image of NUS buildings so that users can easily locate their destination within a building. However, the technology is not yet mature: users need to correct their current location using markers spread all over the campus.
http://www.youtube.com/watch?v=dxyT7Bu_KvE
4. Wikitude Drive (AR)
Wikitude Drive is a new wayFinding application for drivers and pedestrians. In terms of interaction with users, Wikitude offers the most natural experience: users do not have to read a map, they just follow the line, and they never have to take their eyes off the road, so it is safer. Lastly, it is simply more fun and cool! It can also switch to pedestrian mode, where the interface is the same as in driving mode.
http://www.youtube.com/watch?v=g-0cuqeUvCQ
5. Future wayFinding Applications
In the future, when AR technology becomes more mature, optical see-through displays could be used in a one-for-all wayFinding application. The user would simply wear the optical device and see the real world overlaid with virtual objects directing him to his destination. With speech recognition, the user could simply say where he wants to go, and the device would show the directions directly on top of his view of the real world. In our opinion, this would be the most natural way to interact with the user for navigation.

optical see-through display
http://www.se.rit.edu/~jrv/research/ar/introduction.html
Conclusion
As discussed above, there are different types of wayFinding applications: indoor, outdoor, driving and walking. Each offers different user interactions and suits a specific situation, and no one-for-all wayFinding application exists so far. In the future, however, with advancements in HCI such as AR technology and voice recognition, wayFinding will become more natural, convenient and accurate.