Idoru.js is a UX-forward framework for creating synthetically charming artificial characters in online virtual worlds.
This is a default Idoru. It has the simplest avatar (9 spheres and a cone), the simplest body language (5 smooth controls), and the simplest AI (a rudimentary Eliza-style chatterbot) that a thing could have and still be considered an artificial character with synthetic charm. At this point, its charm is nothing more than eye contact, head tilting, an infinite range of subtle facial expressions, plus automatic breathing and blinking, all of which combine to make you feel more connected to it than you could to any static virtual entity.
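An Eliza-style chatterbot is just an ordered list of pattern/response rules with a fallback. The sketch below is only an illustration of that technique - the rules and function names are made up here, not taken from Idoru.js itself:

```javascript
// Minimal Eliza-style chatterbot: ordered regex rules, first match wins.
// Rule text and names are illustrative, not the actual Idoru.js rule set.
const rules = [
  [/\bI need (.+)/i, m => `Why do you need ${m[1]}?`],
  [/\bI am (.+)/i, m => `How long have you been ${m[1]}?`],
  [/\bbecause\b/i, () => "Is that the real reason?"],
  [/\?$/, () => "What do you think?"],
];
const fallback = "Tell me more.";

function reply(input) {
  for (const [pattern, respond] of rules) {
    const match = input.match(pattern);
    if (match) return respond(match);
  }
  return fallback;
}
```

The trick, as with the original ELIZA, is that reflecting the user's own words back ("Why do you need rest?") feels far more attentive than it is - the same sleight of hand the avatar's eye contact and head tilting perform visually.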
An idoru makes an excellent starting point for virtual teachers, tour guides, public speakers, comedians, whatever - just add a custom avatar and a "lesson plan". The idoru pays attention to the user, exhibits body language to establish a connection and keep them engaged, follows their movements while respecting their personal space, pauses the lesson plan to field questions, and tracks their movements and actions for later analysis.
(Currently, it's collecting data on how long users stay on this page depending on whether the avatar appears to the left or to the right of the chat window. Once we know which side encourages the most user engagement, it'll move there for good and start collecting data about something else.)
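That experiment is a plain A/B test: assign a side at random, record dwell time, and compare the averages. A minimal sketch of the idea - the function names and data shape here are hypothetical, since the page's actual instrumentation isn't published:

```javascript
// Randomly assign the avatar to one side of the chat window per visit.
function pickSide() {
  return Math.random() < 0.5 ? "left" : "right";
}

// Given recorded samples like { side: "left", dwell: 42 } (dwell in seconds),
// return the side with the higher mean dwell time.
function winningSide(samples) {
  const totals = { left: { sum: 0, n: 0 }, right: { sum: 0, n: 0 } };
  for (const { side, dwell } of samples) {
    totals[side].sum += dwell;
    totals[side].n += 1;
  }
  const mean = s => (totals[s].n ? totals[s].sum / totals[s].n : 0);
  return mean("left") >= mean("right") ? "left" : "right";
}
```

A production version would want a significance test before "moving there for good", but the comparison of mean dwell times is the core of it.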
This is not an avatar containing an AI - this is an artificial character, and like you, it controls its avatar like a marionette. You could hook it up to an avatar that people use, and if the AI were good enough, people wouldn't know it wasn't real. You couldn't do that with a real-life robot - but in virtual reality, even though it's artificial, its "form" would be no different from yours.
This is a very rough prototype. If you want to know more - go ahead and ask it.
Ask it anything. It's a lot like chatting with a real person who has a few screws loose.
For more information, visit 3dspace.com
First uploaded March 31, 2016. Updated April 15, 2016.