A rather challenging but enjoyable task I had to work on recently was the development of a manifesto in a creative way. At first the task sounded easy, but in fact it required a lot of thought. In essence, I needed to summarise what my artistic intentions are.
This was an invaluable exercise for the start of my MA. It required me to sum up and state publicly my intentions, which was easier said than done. I developed my thought process and refined my ideas using the sketching software on my iPad – Paper (a great, simple note/sketching app which is beautifully presented). On top of this difficult task I also needed to develop a way of presenting it.
My manifesto is as follows:
- Freedom of expression
- Stand up for what you believe in
- Use what you have to its full potential
- Use technology in innovative and creative ways
- Visual, auditory & kinesthetic
- Technology levels the playing field
- Develop my own style, believe in myself and be creative
- Develop the link between technology and art
- Use my technical skills more creatively
- Express myself, for myself
After some thought I decided it would be good to combine some of my wide variety of skills and develop an augmented reality manifesto. After further research I settled on an amazing app called Aurasma. I signed up for developer status, which is easy and free. The process was slightly complicated, but not overly difficult as long as you have basic technical skills.
After developing my manifesto I recreated each bullet point in Maya so I would have some 3D text. This was a challenge, as I had only previously used Cinema 4D – although the principles are the same, the software is laid out totally differently! After a couple of frustrating nights developing models in Maya, applying the correct lighting (very important!) and exporting in the very specific version of the .dae format, I had my models.
For each one I created a simple synthesised sound in Logic Studio, which plays in the background when the point is displayed. These files all needed to be combined into a single archive (.tar) along with a 300 × 300 icon. If the archive wasn't assembled correctly, it would not work properly. This part required a further week of refinement and experimentation to get everything working.
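For anyone curious about the packaging step, here is a minimal sketch of how the models, sounds and icon could be bundled into a plain .tar archive using Python's standard `tarfile` module. The function name, file names and flat folder layout are my own illustrative assumptions – Aurasma's actual requirements were more specific than this.

```python
import tarfile
from pathlib import Path

def bundle_aura_assets(asset_paths, icon_path, out_path):
    """Bundle 3D models, audio files and a 300 x 300 icon into one
    uncompressed .tar archive (illustrative layout, not Aurasma's exact spec)."""
    with tarfile.open(out_path, "w") as tar:  # "w" = plain tar, no compression
        for path in asset_paths:
            # store each asset flat in the archive, under its file name only
            tar.add(path, arcname=Path(path).name)
        # the icon sits alongside the other assets
        tar.add(icon_path, arcname=Path(icon_path).name)
    return out_path
```

Listing the archive afterwards with `tarfile.open(out_path).getnames()` is a quick sanity check that everything was packed before uploading – a step that would have saved me a few of those frustrating evenings.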
I also required a series of images to trigger the "aura", as it is called in the app – the 3D image. I chose 12 images that hold a variety of meanings for me and in some way represent each of the points.
Hopefully in the slideshow below you can see the results. I have to say that I was incredibly happy with how this experiment turned out. The tracking as you move the iPad/iPhone camera around, up and down the image is excellent. If I'd had a little more time, I would have liked to figure out how to add coloured textures to the objects – that had me stumped, and I had to give it up to hit the deadline. It would also be great to add animation to the text – maybe in the future!
I would post a link to my shared Aurasma stream – you can download the app for free and then follow my work – but unfortunately you would also need my trigger images, which I won't be posting. The slideshow below shows some static images; I will probably upload a video demo in the not too distant future.
I was quite happy with the positive feedback I received when I showed this to my fellow classmates. It seemed to be very engaging, and people were quite intrigued by it. As always, any feedback is appreciated!