I found a good spot, got out the tablet, shaded the screen with my body, and made a start: timidly at first, but I grew braver, sketching lines, colouring over them with the highlight-marker brush, then going back for more lines. Suddenly about 20 minutes had slipped by and what I had done looked OK. I moved into the shade of a nearby tree and it looked better; OK, better for my first outdoor attempt. It was, after all, just a field sketch (see above).

Today I stitched together three photos I took of Table Rock from the same vantage point and thought I should try improving the colouring (and tone) of my sketch. It didn't take long to realize this would be a perfect little test for the Neural Style feature of Google Deep Dream Generator. What I tried was using my Bamboo Paper sketch as the style image to filter my photographic panorama, but borrowing the colour scheme from my photo rather than from the sketch. In other words: adopt the line work and flat highlighter-style shading from my sketch, but stay close to the photographic colours (and tone). I must say I was impressed. I'm not sure how to take this AI art (artificially intelligent art) further, but it does produce something with the sense of my touch and the way the place felt, something excitingly worthwhile.
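For anyone curious how the "keep the photo's colours" trick can work, here is a minimal Python sketch of one common colour-preservation approach: recombine the stylized image's luminance (the line work and shading) with the original photo's chroma channels. This is only an illustration of the general technique, not a claim about what Deep Dream Generator does internally, and the file names are placeholders. It assumes Pillow is installed.

```python
from PIL import Image

# Placeholder file names: the stylized output from the style-transfer
# step, and the original photographic panorama.
stylized = Image.open("stylized_output.jpg").convert("YCbCr")
photo = Image.open("table_rock_panorama.jpg").convert("YCbCr")

# Resize the photo so the channels line up pixel-for-pixel.
photo = photo.resize(stylized.size)

# Keep the luminance (Y) from the stylized image, but take the
# chroma channels (Cb, Cr: the colours) from the photo.
y, _, _ = stylized.split()
_, cb, cr = photo.split()

result = Image.merge("YCbCr", (y, cb, cr)).convert("RGB")
result.save("stylized_with_photo_colour.jpg")
```

The idea is that style transfer mostly changes texture and tone, which live in the luminance channel, so swapping the chroma back restores the photographic palette while keeping the hand-drawn feel.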