Thursday 21 April 2011

Retired this blog - join me at the nice new address: http://nicenewblog.blogspot.com/2011/04/nice-new-blog.html

http://nicenewblog.blogspot.com/
Yes nice new blog at the above address, sorry Phil - the old one had to go.
Goodbye old followers - why not join me in a whole new world of post-doctoral loveliness: new design schema, hopeful typeface, uncynical and shiny... spring-cleaned, fresh and fragrant...

Friday 15 April 2011

Corrections accepted

Corrections accepted, phew...not sure what to do with this blog now....retire it and start a nice new shiny one?

Tuesday 29 March 2011

Passed PhD viva with minor corrections..
:-)

Friday 25 March 2011


Lee and I have had our paper accepted for ISEA 2011 in Istanbul this September; the paper is about our VAINS collaboration. No idea how we are going to fund it... we might have to hitch-hike and sleep in ditches. But it's pleasing to get through - apparently there were thousands of applicants, so that's a bit of validation for our project.

"ISEA2011 Istanbul is the international festival of new media, electronic and digital arts. The 17th International Symposium on Electronic Art, a leading world conference and exhibition event for art, media and technology, is scheduled for September 14 to 21, 2011 in Istanbul, Turkey. The ISEA2011 Istanbul exhibition will coincide with the Istanbul Biennial and will provide a fantastic opportunity to showcase contemporary new media arts. "

My viva rehearsal on Monday seemed to go OK, but I've got to work on making it clearer exactly how the software works with the book and egg. No one liked my film much, so I don't think I'll bother to show it - probably won't have time anyway...

Wednesday 23 March 2011

South film


Other places to load South content - just as good on a Kindle or mobile phone...

Our beautiful robot

So today we built our first Buridan's Robot prototype; it now responds to light, and next we will give it conflicting desires. Below is very simple Urbi code to make it communicate when the light reaches a certain brightness:

load("/home/student/creativerobotics/urbi-for-bioloid/dynamixel.u");

var Global.d = Dynamixel.new; // the Dynamixel bus object, passed to each device below

class Thing
{

function init()
{
/* var this.shoulder =
Dynamixel.Device.new(d,1,Dynamixel.DeviceDesc.AX12);
var this.elbow =
Dynamixel.Device.new(d,2,Dynamixel.DeviceDesc.AX12);
var this.wrist =
Dynamixel.Device.new(d,3,Dynamixel.DeviceDesc.AX12);
var this.hand =
Dynamixel.Device.new(d,4,Dynamixel.DeviceDesc.AX12);
*/

var this.sensorH = Dynamixel.Device.new(d,100,
Dynamixel.DeviceDesc.AXS1);
var this.sensorRead = 0;



};

function start_feedback(){ // poll the light sensor every 0.1s in the background
detach
({
every(0.1s)
{
this.sensorRead = this.sensorH.lightLeft;
}
});

};


function startReact(){
// the comparison operator here is a guess: react whenever the light reading is above 10
whenever (this.sensorRead > 10)
{
echo("light");
}
};



};
var thing2 = Thing.new;
thing2.start_feedback;

thing2.startReact;
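
The "conflicting desires" part doesn't exist yet. As a very rough sketch of where it could go (assuming dynamixel.u also exposes a lightRight slot to match the lightLeft used above - not checked), something like this could run after startReact:

// sketch only: treat the left and right light readings as two competing desires
whenever (thing2.sensorH.lightLeft > 10 && thing2.sensorH.lightRight > 10)
{
// both desires active at once - Buridan's dilemma, ask us for help
echo("help me choose");
};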


Wednesday 16 March 2011

Creative Robotics: basic plan for Buridan's Robot, a robot that needs help to make choices


//we are finally getting somewhere with Urbi in Creative Robotics

loadModule("urbi/qt-for-urbi");
load("thing.u");

var thing = Thing.new;

// map the 0-100 slider value onto roughly -1.7..1.7 (presumably radians) for targetPos
function conv(a){
var res = ((a * 3.4 / 100.0) - 1.7);
res
}|

function gui_stuff()
{
var window = Qt.Widget.new;//create a window
var layout = Qt.GridLayout.new;//create the layout to display the
// things in the window
window.setLayout(layout);//set the layout in the window !notice the comment is longer than the line ....

//here you create the sliders
var shoulder_slider = Qt.Slider.new;
var elbow_slider = Qt.Slider.new;
var wrist_slider = Qt.Slider.new;
var hand_slider = Qt.Slider.new;


//here you add the stuff to the layout
layout.addWidget(shoulder_slider, 0, 0, Qt.Alignment.AlignHCenter);
layout.addWidget(elbow_slider, 0, 1, Qt.Alignment.AlignHCenter);
layout.addWidget(wrist_slider, 0, 2, Qt.Alignment.AlignHCenter);
layout.addWidget(hand_slider, 0, 3, Qt.Alignment.AlignHCenter);


//the following is how you catch an event, notice the event?(var msg) construct
at (shoulder_slider.sliderMoved?(var msg))
thing.shoulder.targetPos=conv(msg);


at (elbow_slider.sliderMoved?(var msg))
echo(msg);


at (wrist_slider.sliderMoved?(var msg))
echo(msg);


at (hand_slider.sliderMoved?(var msg))
echo(msg);


window.show;//here you make the window visible!!!!

};
//hmm interesting http://support.robotis.com/en/product/bioloid/beginnerkit/download/bioloid_beginner_app.htm
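
The window only appears once the function is actually called, so the session would presumably end with:

// build and show the slider window
gui_stuff();
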
This is our simple idea (rough Urbi sketch after the list):



react to sound - wants to turn to sound - we have a sound sensor

react to light - wants to turn to light - we have a light sensor

if it detects both it will signal the need for our support - how? by moving its head - signs of panic

we might use the attack duck example - simple movements


we activate the proximity sensor to give it permission to make a random decision


decision( Tag)
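
A rough Urbi sketch of that loop, just to pin the idea down - it takes pretend boolean readings (light, sound, touched) rather than real sensor slots, since only lightLeft has actually been used so far:

// sketch of the decision loop, with pretend boolean sensor readings
function decide(light, sound, touched)
{
if (light && sound)
{
// conflicting desires: signal a need for our support (panicky head movement goes here)
echo("help - light and sound at once");
if (touched)
{
// the proximity sensor gives it permission to make a random decision
if (2.random == 0) echo("turning to light") else echo("turning to sound");
};
}
else if (light)
{
echo("turning to light");
}
else if (sound)
{
echo("turning to sound");
};
};

// try it: both stimuli present and permission given
decide(true, true, true);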