    After Effects CC has the ability to sync Shortcuts and Preferences and so on, but no native support for sync’ing Scripts across multiple machines.  These days I seem to find myself adding a new script every week or so, and I became fed up with having to re-install scripts on my laptop that I had downloaded on my workstation, or vice versa.  Scripts are usually very small files, so they’re an ideal candidate for roll-your-own cloud sync’ing.  I’ve been using this technique for a while now and it works well.

    Here’s how:

    Firstly you need a fast, reliable cloud storage platform. My preferred option here is Dropbox because it synchronises changes much quicker than alternatives like Copy or OneDrive. Dropbox’s free 2GB is more than enough for a few hundred scripts.  Next, you need to install SymbolicLinker. Symbolic Links are a bit like Aliases, but implemented at the UNIX level of the OS, and they seem to work better for this kind of thing.

    1. Copy your Scripts folder from within After Effects to Dropbox and put it somewhere you’re happy with - you can’t move it later without breaking things so make sure your parent folder setup is what you want.  

    2. Right-click on your new Dropbox Scripts folder and go to Services>Make Symbolic Link. You’ll get what looks like a copy of the folder.

    3. Drag this over to the After Effects Application folder (you may need to change permissions for it), delete the original Scripts folder and rename the Symbolic Link folder to just ‘Scripts’ so everything looks normal again.

    4. Repeat Steps 2 and 3 on your other machines (once Dropbox has finished sync’ing on them).
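    If you're comfortable in Terminal, Steps 2 and 3 can also be done with a single `ln -s` command instead of SymbolicLinker.  Here's a sketch using a temporary folder so you can try it safely - swap the demo paths for your real Dropbox and After Effects locations:

```shell
# Demo of the symlink technique in a temporary folder.
# Swap these paths for ~/Dropbox/... and the AE application folder.
demo=$(mktemp -d)
mkdir -p "$demo/Dropbox/Scripts"                 # the cloud-synced master folder
ln -s "$demo/Dropbox/Scripts" "$demo/Scripts"    # the symlink AE would load from

# Anything added to the master folder appears through the link too
touch "$demo/Dropbox/Scripts/example.jsx"
ls "$demo/Scripts"
```

    The link behaves like a normal folder to After Effects, but any file dropped into either location is really stored in the Dropbox master.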

    Essentially what we’ve done is to make a cloud sync’d ‘Master’ Scripts folder and then placed virtual copies of it for After Effects to load.  The beauty of this method is that when we now add a new script (or any file or folder) to our Adobe After Effects CC > Scripts folder on any machine, the script automatically gets pushed up to the cloud and back down on to the other machines, via the SymLink master.  Since AE only loads scripts at startup, you can even do this whilst running multiple versions of AE and nothing untoward should happen - the new scripts are not loaded until you restart AE.

    I use Macs but the same technique should work on Windows - some info on symlinks here.  Because the Master folder is the origin of the symlink, you could even use it across OS platforms, and a small studio could sync scripts across several machines, provided they are signed in to the same Dropbox account.  These are untested theories though - so far I’ve just used it across two Macs.

    November 7th 2014

    Yanobox Nodes is a plugin which renders points in space and connections between them.  It’s been around a while but didn’t gain much traction, possibly due to being Mac-only and having limited functionality.  With the recently released version 2, a ground-up rewrite of the rendering engine has enabled a big increase in the level of creative control. It is also available for Premiere, Final Cut Pro and Motion, but I only have experience of it in the AE host.

    Nodes 2 is great fun to play with.  There are a lot of presets you can use as a springboard for exploration and it’s easy to create something that looks appealingly complex. It feels robust, adapting resolution well when scrubbing in the timeline, and renders fairly fast too.  The UI itself feels solid and Mac-like.  I’m not a big fan of the standalone UIs (from Video CoPilot, for example), which feel like ports of Windows code and are often unresponsive.  Nodes is more like Trapcode Particular in that you have a long list of parameters, many of which are greyed out if dependent on other options.  Yanobox has made some strange decisions about the order of the modules, such as having the ‘Transform’ and ‘Oscillator’ (Noise) groups before the basic ‘Form’ object generator, and some of the naming is unconventional, but these don’t cause too much of a headache once you get used to them.

    Nodes has a unique rendering engine that features certain accelerated fonts as well as supporting 2.5D sprites of up to 4096 pixels - a resolution that most other particle generators don’t offer.  There’s no support for orthographic views (although Custom 3D views work fine) and no Depth of Field. But there is a fast internal Motion Blur calculator which is good to see.  Strangely, there is also no option to render lines on top of points, but since you can always duplicate the layer and flip the checkboxes on and off when you’re ready to render, it’s no big deal.

    The only thing that frustrated me about the plugin is the lack of options when transforming from one 3D form to another.  Grids, rows, circles, spheres, waves and spirals - all seem to flip through some kind of twisted wave-like shape without giving you much control.  You always have the option of using your own OBJ sequence, but you then lose some of the other controls (over the orientation of text, for example) and add substantial overhead to your workflow. Nodes also has a method for cycling through points in a carousel fashion, highlighting one at a time, which works really well.

    Like a lot of 3D particle plugins, Nodes 2 tends to generate a certain ‘look’ unless you stretch it beyond some of the default options.  The most distinctive features of Nodes are its curved lines and more sophisticated text and sprite tools.  It would be perfect for a lot of David McCandless style dataviz, as well as FUI work (it was already put to good use in Ender’s Game).  With a bit more imagination it could also be useful for clocks and counters, maps, tracing and trailing 3D forms, and even generating rudimentary grass and hair, as well as a lot more, I'm sure.  The generative 3D data look has become increasingly popular in the last few years and this plugin offers both a welcome expansion and an alternative functionality to others in the same vein.  Its ease of use, compared to a complex setup in fully-fledged 3D, is likely to make it ideal for smaller fast-turnaround projects or for combining with other techniques to add an extra textural component in the composite.

    April 25th 2014

    There's no doubt that the workstation is becoming more and more of a niche product.  Those working at the top-end of VFX or medical research still need the fastest computers money can buy; but their puddle, once occupied by everyone from print designers to architects, has been shrinking around them for a long time now.

    Apple have stopped selling the Mac Pro in Europe, pending an unknown "something really great", whilst Intel's roadmap is shifting towards mobile and lower-power, embedded chips as consumer demand for raw processing horsepower slows down, and we enter the post-PC era.  Even HP, the world's second largest manufacturer of workstations, claim they "need to be in the tablet space".

    What will this decline of the market mean for our video hardware tools? That they become more costly seems likely, but perhaps there are new opportunities there as well. The kind of vertical integration that companies like Blackmagic are newly adopting could become compelling again.  Avid sold off their consumer business last year, and with Final Cut Pro becoming less useful to many, they are starting to look like they might pull out of the nose-dive.  Adobe have been working more closely with Nvidia than ever before, which makes me wonder if there are opportunities for the likes of Autodesk and AMD, for example, to do the same.  Will it continue to be enough to simply make software?

    As fibre-optic connectivity speeds spread around the globe, offsite render farms become cheaper and more viable to more people.  Adobe are planning for future versions of their video software to directly access remotely stored and processed media. Could it be that soon we won't need onsite rendering power at all?

    Within the studio, the speed of the Thunderbolt interface gives manufacturers opportunities for greater expandability than ever before.  It's easy to imagine an extremely modular, upgradeable workstation system - a central IO hub with external GPU, render farm and so on. But it's not so easy to imagine a company both capable of and interested in making one.  That would have been a perfect opportunity for the Apple of old, but these days the profits in it would be so small compared to their personal products that they could never justify all the investment in R&D, could they?

    There are certainly some interesting times ahead.  I try not to let the tools play too much of a role in the design phase because I think you can just as easily allow your role to become defined by your capability in software, hardware and tech know-how as you can by your limitations.  Some jobs I am commissioned for require a render farm, and some I can do entirely on a laptop - maybe I won't need a workstation soon!

    February 28th 2013

    Most designers who work with broadcast media spend all day looking at a rectangular 'canvas' of 16:9 proportions.  But it wasn't until last week that I found out where 16:9 actually comes from.

    I recently upgraded my main working monitor from an old Apple 30" to a new NEC 27".  I swapped from 16:10 ratio to 16:9, and I really noticed the different shape.  I've never really liked 16:9 as much as 16:10 but I didn't immediately have an explanation for why I felt this way.  16:9 has always felt slightly too long to me.  When you place a head and shoulders in it, there's a bit too much space left over, and I often find it tricky to arrange a good composition based on thirds within 16:9.

    It turns out that 16:10 is very close to the golden section, and having used that ratio with countless Daler Rowney sketchpads as a kid I think the proportions must be ingrained in my subconscious.  16:10 just feels like the perfect rectangle - not too close to a square like 4:3 always was, and not too much like a shape of two sides (left and right) like 16:9 sometimes is.
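    You can check how close it is with a couple of lines of Python:

```python
import math

phi = (1 + math.sqrt(5)) / 2   # the golden ratio, (1 + sqrt(5)) / 2
print(round(phi, 3))           # 1.618
print(round(16 / 10, 3))       # 1.6   - within about 1% of phi
print(round(16 / 9, 3))        # 1.778 - noticeably longer
```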


    The golden section has a long and illustrious history, so where did 16:9 come in?

    According to this document, 16:9 was not originally conceived to be a display aspect ratio, but as a new standard for electronic movie distribution. A kind of democratic average of some of the most common ratios of the day - from 4:3 to Cinemascope - the best ratio to 'contain' other ratios with minimal letterboxing.
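    There's a neat way to see that 'containing' idea in numbers: 16:9 sits very close to the geometric mean of the 4:3 TV frame and 2.35:1 Cinemascope - a sensible compromise between the two extremes.  A quick check in Python (my own back-of-envelope arithmetic, not a figure from that document):

```python
import math

tv = 4 / 3        # 1.333..., the old TV ratio
scope = 2.35      # Cinemascope
mean = math.sqrt(tv * scope)   # geometric mean of the two extremes

print(round(mean, 3))      # 1.77
print(round(16 / 9, 3))    # 1.778
```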

    Then it gained ground in TV manufacture and became 'Widescreen'. Nowadays a lot of people watch full-screen movies on their computers, and for some reason LCD panels are much cheaper to produce in 16:9 than in 16:10. Two reasons why 16:10 is disappearing from our desktops.  Apart from our sketchpads, that is.

    April 12th 2012

    I designed the characters which formed the backbone of this year's Promax conference identity for Knifedge:


    We knew the conference theme was going to be something to do with the so-called digital revolution so I played around with ideas and inspiration from Paperrad, Vivienne Westwood, Barbara Kruger and Shepard Fairey, but in the end it was all a bit too abstruse to have the necessary instant impact. The first step in the right direction was to give in to the obvious and use iconic revolutionary figureheads - Che Guevara, Lenin and Mao. I had fun making Banksy-esque spray stencils of these guys proclaiming the digital revolution with iPads and 3D specs.

    Jon was adamant that more irreverent humour was the way to go. So I changed the look to something drawing more from Warhol and Terry Gilliam, which I thought would animate more expressively and have more stylistic integrity with the scripts that were being developed. The Knifedge team then went on to design the website and marketing materials, and animate a series of short films based on the characters, including this fantastically camp opener:


    November 29th 2011

    I love the posters that Tom Purvis designed for clients like the L.N.E.R and Austin Reed between the wars.  His economy of means with colour and composition is so bold and perfectly balanced that I imagine the images virtually jumped off the billboards in the 1930s.  His figures rarely have faces, yet they always have character and a hint of a story to tell.  You can just imagine what the old fisherman in this picture is thinking:



    Purvis produced a well-known series of posters in 1931 called 'East Coast Joys' which, when placed side by side, form a continuous image.  It's not easy to appreciate how clever this is just from seeing a few thumbnails though, so many years ago when I was teaching myself Adobe Illustrator I reproduced the set as a single image (click it to get a better idea):



    I believe the intention was that they could be used both on smaller stations with one or two hoardings, and like this at a London terminus, where longer platforms and concourses provided more space whilst requiring something that would be even more effective against the increased competition for people's attention.  Seen as a whole here, you can appreciate the use of such a limited palette and the classical device of composing the image around a figure-of-eight in order to keep giving the eye something to move on to.  The tricks of perspective and scale that Purvis uses are something that would never have worked in a more realistic style, and they remind me of some of the faking of relative scale that we often have to use with motion work in order to steer the audience's attention across space in a harmonious and timely manner.



    Something like Purvis's style shows up quite a lot in contemporary motion work, partly because angular, simplified rotoscoping is fairly quick to execute. Daniel Kleinman's beautiful titles for Casino Royale and The Thomas Beale Cipher spring to mind.

    July 28th 2011

    I've done a fair bit of freelancing around studios in London, and for the last few years I've also had some responsibility for hiring and directing other freelance designers and animators.  Here are a few things I personally think I've learnt, aimed at any newcomers to freelancing.  If you're after tips specifically about motion design freelancing, check this page out for some good advice.

    1. Look after your body

    Okay, it's a little obvious; but your No.1 asset is your physical (and mental) health.  Without it you won't be doing very much of anything, and the effects of your day-to-day routines will last long after you've retired.  If you spend a serious amount of time working on a computer in your own home or studio, invest in a good chair that will minimise the harm to your back from all that sitting.  It doesn't have to cost a fortune -  I picked up a HAG chair on ebay for about £100.  If you don't already use a Wacom tablet and you spend most of your day using a mouse, you might want to consider switching to a graphics tablet to prevent RSI or carpal tunnel syndrome.  Drink water throughout the day to help your concentration, especially in hot or air-conditioned offices.  If you drink enough then you'll be needing to get out of your seat more often too, which is something health advisors recommend for the desk-bound anyway.  Having a hobby that is different to your freelance work, something more physical or based outdoors like DIY or sports, can really help thwart the VDU blues.  If you're the workaholic type, maybe try to include some more bodily activity into your work like life-drawing, model-making or camerawork.

    2. Network in the real world

    It's surprising how many successful freelancers barely have an online presence at all - who they are and what they do is already well known to many potential clients through their existing clients and their presence in the industry.  Meeting people in person is almost always more beneficial than emailing and phoning for new work opportunities.  If you don't feel too confident networking in the pub or cafe consider doing a good course in something like public speaking or assertiveness which might actually give you a few tools to improve things.  In London there are tons of ways to meet people in the creative industries.  My own line of work has regular events like Promax, Glug and See No Evil which anyone can attend, not to mention less frequent festivals and conferences.   If you feel confident enough, it can be very productive to ask someone in a senior position to simply critique your work.  You need to go into something like that with the right attitude as it's not strictly an opportunity to network and you don't want to come across as desperate or naive, but if handled well it can help your confidence and possibly lead on to other things... after all, recommendations are statistically THE most successful route to getting the best new business. 

    3. Become an employer

    You learn a hell of a lot about your clients' point of view simply by being in their shoes for a week.  Look out for an opportunity to sub-contract another freelancer in your line of work: advertise online for applicants and go through the whole process of hiring someone like yourself.  Although you may think you're putting yourself out of a job, it will open your eyes to what the people you submit your work to are going through when you respond to a job posting from them, and you'll gain valuable insights into what it's like to commission creative services.  Wading through literally hundreds of candidates' portfolios takes a long time, and you will soon see who stands out and why.

    4. Track your time

    Klok, Toggl, SlimTimer, OfficeTime... There's loads of free software to keep track of the hours you spend being productive.  Of course it'll help if you have to clarify your hours to your employer, but more importantly you'll soon build up a very accurate impression of how much you are actually working, and how long different types of tasks take you to complete.  This in turn will enable you to make more accurate quotes and manage your time (and other people's time) better.

    5. Get on the cloud

    If you're moving around, operating on different companies' machines or switching between your own laptop, desktop and smartphone, sooner or later you'll want to get a few things synchronised.  Microsoft Exchange Server is now offered as a part of some personal webhosting packages, but it's possible to get much of the same functionality with free services.  Use IMAP email instead of POP and set it up on your smartphone too so that any drafts or sent messages exist on all your devices rather than just one.  Sync your calendars by subscribing to a Google Calendar in iCal or Outlook and on your phone.  Contacts on just about any device or service can be synced with Plaxo. For essential and often-used files I swear by Dropbox, which also functions as an FTP replacement (and a secure one, if you password-protect files in a zip folder). I use the excellent desktop application Notational Velocity, which syncs with the free iPhone app Simplenote so that I can automatically sync any notes, lists, ideas, reminders, blog posts, scripts, recipes or whatever.  With Apple's iCloud on the horizon, a lot of this stuff looks to be made even simpler for the Mac-centric.

    6. Get a good accountant

    A good accountant will effectively pay for themselves.  It took me many years of doing my own accounts before I evolved a system that was truly optimal, whereas an accountant could have put me on the right track much sooner.  Taxation and VAT laws change more often than you'd think and if you don't keep up you could be overpaying, or underpaying and possibly incurring a fine in the future.  My accountant probably saves me at least three days a year of tedious book-keeping - three days that I can spend working on something I love.  Be aware that not all accountants are the same: some of them will merely file your books properly, while others might actively recommend 'creative' ways you could pay less tax.  Try to find one that has some experience with sole traders and media types and is a 'taxation advisor' or similar.

    7. Balance your spending

    When a big payment goes into your account it's tempting to go straight out and buy a new Mac / Bicycle / Wardrobe / Holiday.  Maybe you do need or deserve those things.  But I think you'll fare better during the first year or two if you remain as frugal as you can, and get a clearer picture of your average earnings throughout the year.  Then you can begin to put together a monthly budget so you can set money aside for those things and also be better prepared if you have a dry patch.  A general guide is to put aside 30% of the amount you get paid for tax and expenses.  I use a system of two bank accounts, the 'work' one gets all the invoices paid into it and pays for all my business expenses. I then pay myself a monthly amount (drawings) into my 'personal' account.  Neither of them are technically business bank accounts so the costs are minimal, but they are both with the same bank (Co-op and Smile) so when I transfer money between them online, it happens instantly.  It's worked really well for me, and speeds up my book-keeping no end too.  
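    As a rough sketch of that 30% rule in Python (the figures and function name are just illustrative):

```python
def split_payment(invoice, reserve_rate=0.30):
    """Split an incoming payment into a tax/expenses reserve (kept in the
    'work' account) and drawings (transferred to the 'personal' account)."""
    reserve = invoice * reserve_rate
    drawings = invoice - reserve
    return reserve, drawings

reserve, drawings = split_payment(3000)
print(reserve)    # 900.0 - stays in the work account
print(drawings)   # 2100.0 - available as drawings
```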

    8. Keep a database of contacts

    Experienced freelancers sometimes end up with just a handful of clients who provide them with work year after year.  But when you're starting out I think it's always a good idea to be looking for new opportunities to excel yourself and to help others.  It can seem like there are an endless number of potential clients and it soon gets hard to remember who's who.  So keep a database that details all the contact you've had with each potential client so that you can stay in touch with the frequency and in the manner you choose.  At the very least you need to know who it was you spoke to and a record of when it was.  But it's little things like what they said about where they used to work or who their boyfriend is or something like that which can lead to a better connection between you.

    9. Be discreet

    Design is a mercurial and difficult-to-quantify commodity.  The reasons why one designer or studio is chosen over another have a lot more to do with personality than you might imagine, as Tip No.3 will soon demonstrate.  Whilst I doubt that it's possible to successfully change your entire personality, you can at least try to not sabotage your chances of clicking with people.  If you slag people off, gossip, lie about yourself, solicit other clients on paid time or show up with a hangover then you'll be damaging your chances of being hired again.  Some people seem to think freelancing somehow goes hand in hand with being 'rock'n'roll'.  I'm inclined to think that the opposite is actually true because you come into contact with so many different people that well-rounded diplomacy is likely to fare better than extroversion. It's probably best to simply be friendly, positive, fairly modest and 'nice'.  Then, when you've left the job behind your work can really shine for itself.  I'd also say that whilst it's good to build relationships, be wary of revealing too much about your private life, even if your colleagues do - it's just not very professional in the long run.

    June 14th 2011

    I designed some stings for Ericsson earlier this year (at Saint) which were showcased at Mobile World Congress.  Around the time I was coming up with the work, Satya Meka was beta-testing his new plug-in Plexus, which I got my hands on, and it soon became a vital component of the look we were creating for Ericsson.  Satya himself was also pretty impressed with how we were bending his code to our will, and I agreed to work on some promotional material for the plug-in.  Unfortunately I became too busy with other commitments to finish it at the time (sorry Satya!) but I was able to let him use the Ericsson clips themselves to demo Plexus.

    Since then, I've received quite a few emails from other animators wanting to know how I achieved the results I did.  Plexus is very customisable, but with its origins in Satya's love of generative and data-visualisation graphics, it isn't always easy to get Plexus to form and animate representative images like the globe-network that I had designed.  One of the secrets lies in spending time creating 3D OBJ file sequences which Plexus can read.  I used Cinema 4D, but any application which can export OBJ sequences would do just as well.

    For anybody playing around with Plexus who is frustrated by a lack of access to a 3D program, I've got two files for you to download and play with.  The first is a single OBJ point cloud representing major urban centres around the globe (only 7364 of them, I'm afraid) and the second is a simple sweep nurbs OBJ Sequence which is the starting point for the missile-like trails in this video:
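    If you're curious what Plexus is actually reading, an OBJ point cloud is just a plain text file with one `v x y z` vertex line per point - so you can generate simple clouds without any 3D app at all.  Here's a Python sketch (a toy ring of points, not the actual city data from the download):

```python
import math

def write_obj_points(path, points):
    """Write (x, y, z) tuples as an OBJ point cloud:
    one 'v' vertex line per point, which Plexus treats as particles."""
    with open(path, "w") as f:
        for x, y, z in points:
            f.write(f"v {x:.4f} {y:.4f} {z:.4f}\n")

# A simple ring of 12 points as stand-in geometry
ring = [(math.cos(a) * 100, math.sin(a) * 100, 0.0)
        for a in (i * 2 * math.pi / 12 for i in range(12))]
write_obj_points("ring.obj", ring)
```

    Write one file per frame (ring.0001.obj, ring.0002.obj...) and you have an OBJ sequence Plexus should be able to step through.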

    I know that Satya and Lloyd who runs aescripts are working on a tutorial showing people how to do this kind of thing without using OBJs.  But the globe wouldn't be possible without them.  Also... I can't help wanting to point out that I wouldn't have developed these techniques if I hadn't had a design problem to solve and a strong vision of what I thought would work well for the client's brief - that's the real secret!

    Now, if only I could apply it to my self-initiated projects...

    June 5th 2011


    Last week, both Adobe and Apple unveiled new software: CS5.5 and FCP X respectively.  The greatest changes to the Creative Suite are within Premiere Pro, Flash and After Effects, the latter being the only one of the three that I use regularly.  I don't use Final Cut very often these days either, but I followed the announcement because I have a vested interest in Apple's commitment to the Pro market (which they have been neglecting somewhat in the last few years). I mostly use Apple computers.

    Final Cut Pro X seems to offer at least one or two paradigm-shifting ideas: the kind of features you didn't know you needed until you were given them.  This is what Apple are often so good at, and it's the reason why I prefer OS X over Windows.  It was reassuring to see that they are still applying this broad thinking to a professional product.  In contrast, Adobe seem to have a juggernaut on their hands with the Creative Suite, and far from re-imagining their products they just keep bolting on extra features.  The last few additions to After Effects have focused on middleweight compositing tools to complement the middleweight NLE that is Premiere.  This seems like rather a limited vision of the software to me, and I find myself siding with Angie Taylor:

    "I’m more interested in making things look surreal than real! I want tools that help me push new boundaries and inspire me creatively."

    So what could Adobe do to really improve After Effects for those of us with a more... artistic persuasion (dare I say)?

    More control over basic properties.

    One of the things I love about After Effects is that we have a fairly natural, almost painterly control over the image that is lacking in 3D apps which are founded upon simulated laws of physics for light and space. In AE we can use 3D layers as mattes for 2D layers. We can apply blending modes to them. We can have 2D layer 'containers' of advanced 3D effects (Trapcode etc).  There are some gotchas and issues with the way things are implemented, but once you understand the idiosyncrasies of the timeline and render order you get a lot of artistic control.  Yet it could have been taken further. Motion blur can be enabled per layer - so why not Depth of Field? Why can't we keyframe motion blur? I'd also love to be able to set hold keyframes for a layer's parent, which would greatly simplify a lot of animation tasks.  Whilst the Expression Timeline script goes some way to interpolating expressions on and off, I'd like that to be easily keyframeable too.  Why do we not have reflection properties to complement the shadows as Motion 4 did? And now that we've finally been given light falloff, what about adding camera (visibility) falloff?

    Advanced Render Order

    Something I've always admired about AE over Photoshop is how non-destructive it is. But for my liking, AE's 2D layer feature set borrows too heavily from Photoshop's simplistic paint-mask concept.  The layer order in the timeline becomes useless as an indication of render order as soon as we mix 2D and 3D layers, collapsed 3D precomps and so on.  Why not take things a step further and bring in something like smart adjustment layers?  This would provide a better system of applying single, unified effects to multiple layers.  At present, you can either set that up by pickwhipping expressions, or by precomposing if you don't mind all the layers being flattened into one in the render order.  Neither is very elegant.  What I imagine is smart effects, which affect all layers of type 'x' so that, for example, I can apply a single levels effect to all my 3D layers tagged 'background' in just a few clicks, knowing that any layers I create in the future with those properties will also be adjusted.

    Tags and Groups

    Which brings me on to the subject of tags and selections.  AE's support for layer sets (in the form of labels) is too rudimentary.  The 'Zorro' script goes some way to providing a tag system, but it should be done better, natively.  I also want menu commands or a palette for 'Select all 3D layers', 'Select all layers with animated properties' and so on.  Many people have asked for simple layer groups in the timeline so we can twirl-up a bunch of layers not in use.  I almost wonder if it might be worth considering a more fully-fledged, combined grouping and parenting function such as that in Cinema 4D, perhaps even dispensing with 2D layers altogether.  The layer stack = render order could still function for objects sharing a Z depth just as it presently does.

    Live 3D

    Many people have called for AE to offer support for basic 3D extrusions, or OBJ support, which to an extent is what the Zaxwerks plugins provide.  Personally, I think we should be aiming higher, and even halfway decent 3D requires a whole host of modelling and texturing tools which are best left to dedicated applications.  What I would love to see instead is some kind of 'Edit Original' for cameras, lights and certain 3D nulls, which would open up my corresponding scene in Cinema 4D or Maya.  Upon returning to AE, my 3D layers' properties would be updated and my footage items would also be auto-replaced with my new 3D renders.  I guess this won't happen until Adobe owns a serious 3D application because they would need to rewrite the entire AE 3D engine to bring it into line with the 3D app.  But until they do this they are leaving the door increasingly open to their competitors.


    Physics

    Physics simulations such as we see in apps like Cinema4D, Motion and even Anime Studio Pro would really require a whole new feature-set, as they need to be 'run' in order to calculate their outcomes.  After Effects has traditionally eschewed this kind of thing because the processing power required often bogged down the underpowered desktop hardware that AE has to run on.  The old, slow Particle Playground effect is a case in point, and Apple Motion's knack of crashing under strain also proved that desktop computers aren't quite ready for sophisticated realtime physics yet.  With the Roto Brush and its 'freezing' of spans, Adobe has introduced a very basic interface for caching complex calculated values without having to actually bake them into keyframes.  I wonder if something similar could help with the GUI of 3D physics simulations, effectively freezing groups of 3D layers into cached 2D outputs upon running a simulation.


    Expressions

    Expressions are undoubtedly one of the most powerful native features of After Effects, allowing us to concoct recipes and simulations of simple behaviours that can save a lot of time and offer new styles to our work.  There are a lot of things Adobe could do to make them easier to work with, like including a dedicated expression editor panel, visible line numbers, colour-coding and auto-complete/hinting.  One simple feature that would make expressions so much easier to learn would be an output panel that displays the changing values of variables as a comp is previewed.  Multiple expressions should be easier to work with, rather than the hack of having to look them up from text layers.  True global functions with greater 3D access and effectors like 3D noise or repulsion/attraction would create enormous possibilities.
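    Most expression tricks boil down to mapping one changing value onto another, and AE's built-in linear() function is the workhorse for that.  As a sketch of what it does, here it is re-implemented in plain JavaScript (outside AE, so the comp time is just an ordinary variable here):

```javascript
// A stand-in for After Effects' linear() expression function:
// maps t from the range [tMin, tMax] to [vMin, vMax], clamped at both ends.
function linear(t, tMin, tMax, vMin, vMax) {
  if (t <= tMin) return vMin;
  if (t >= tMax) return vMax;
  return vMin + (vMax - vMin) * (t - tMin) / (tMax - tMin);
}

// e.g. fading opacity from 0 to 100 between 1s and 2s of comp time
const time = 1.5;                        // AE supplies `time` for you
console.log(linear(time, 1, 2, 0, 100)); // 50
```

    An output panel showing values like this as the comp plays is exactly the kind of learning aid I mean.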

    Drawing tools

    AE's Paint tools seem to focus on animating the paint strokes themselves.  Whilst I'd love to see better frame-by-frame drawing and onion-skinning tools in After Effects, geared more towards line-testing character animation, that is something other applications like Flipbook or TVP do well enough as standalones.  Where I think AE could improve is with its vector tools.  Shape Layers were a boon when they first appeared, but it's a shame they have not been improved with successive versions.  I'd like to see some of Illustrator's drawing tools, like the Pathfinder, the Blob brush and variable-width strokes, adopted in AE so that it could begin to be used as a cartooning tool, not just a graphic symbol framework.  Within the time-based tools we could really benefit from smoothing controls and morphing hints for animated paths.  Support for brushes could bring us boiling-line effects.  I'd love to be able to automatically link up portions of two or more animated paths, possibly via point-based expressions, to ease figure animation and shading.  Global live colours with a swatches palette is also a bit of a no-brainer.

    GUI Improvements

    This is probably the biggest area in which Adobe could learn from Apple, and I'm not talking about multi-touch.  Something I liked about Combustion was the way the interface altered quite significantly depending on what you were doing.  Of course it is useful to be able to keep an eye on one thing while doing another, but the mini-interfaces of more complex tools like Shape Layers, the Puppet Tool or Trapcode Particular are just so poorly presented to the artist.  I'd like to see custom-controlled HUDs in my comp window, much like those in Cinema 4D, without having to resort to rolling my own via complex expressions (see the second video on this page for an excellent example).  I'd also like methods of controlling the interface on a per-comp, per-layer and even per-timespan basis so that, for example, when I select the camera in my precomp, the comp window automatically jumps to showing two orthographic views alongside a second viewer with the Active Camera view in the main comp.


    Adobe is in a unique position: with software like Photoshop, Illustrator and, to a lesser extent, After Effects, they have something of a monopoly, which can stifle innovation.  They also seem to operate under a slightly old-fashioned model of responding to quantitative market research rather than initiating ideas, as key thinkers like Apple and Google do.  And they've taken the path of marketing bundles of applications as suites whilst struggling to usefully integrate them with one another and develop them all in the same timeframes.

    When Adobe decide to make a competitive product like InDesign, Lightroom or Premiere, they produce great results and everyone benefits, but for the last few years the apps I use daily have languished and bloated somewhat, which I think is a real shame.

    With FCP X, Apple have decided (presumably many months ago) to take the risk of rebuilding their biggest piece of pro software from the ground up.  This will fight back against the looming threat of Premiere, and bearing in mind the tech that Apple bought with Shake and Color, we may yet see something to rival Smoke or After Effects.  Fingers crossed.

    April 22nd 2011

    Just over a year ago I was involved in pitching for a show with a working title of 'Surgery Live'.  We didn't win the pitch and the show was aired on Channel 4 as 'The Operation: Surgery Live', but since it was such a challenging and interesting brief I thought I'd share my ideas for it here.

    The brief for the titles was this: "More of a programme identity than a sequence, and the emphasis is very much on the live action and the unpredictable nature of live surgery on TV, hence the ‘identity’ has to reflect the drama and the strength of content will do the talking."

    At first I concentrated on a typographic treatment that would bring the sense of 'live', using a stylised halftone suggestive of video scan-lines.  This was easy enough as an idea, but wasn't really reflecting the strength of the content.  I was very aware of the danger of sensationalising what was a slightly controversial programme.  The producers had assured me that the show was intended and would be marketed as a serious, educational, eye-opening use of the televisual medium rather than anything voyeuristic or exploitative.  Live surgery had previously been hosted by the Wellcome Collection to a private studio audience and this was essentially a public extension of that venture.

    Nevertheless, the first job of any TV title sequence is to make the audience want to watch the rest of the programme, so whatever I did had to be engaging and at least arouse curiosity.  We weren't sure at this stage whether the audience would already have been forewarned by the continuity announcer about the explicit nature of the imagery so I didn't want to be too graphic in depicting surgery.  The whole notion of curiosity and the inside of the body as a secret place influenced my thinking, and I started making some quick tests around a 'seeing through the keyhole' type of idea.


    These were a step in the right direction, but the keyhole metaphor seemed too obvious and the imagery risked looking unrecognisable and obscure.  I remembered the video Intro did for Primal Scream's Kill All Hippies, and something about its subtractive use of negative space had stuck in my mind.  Then I realised that I could flip my idea on its head: instead of using a keyhole, I could cut out (like a surgeon...?) the explicit body parts from the footage, leaving us seeing the iconic surgeon's hands and implements working away at... something.  I was pretty sure that with the right footage we could get an eerie, absorbing sense of the 'Unheimlich', and as a graphical juxtaposition to something so alive and real it seemed natural to place very cold, objective and historical imagery alongside it - the iconic Gray's Anatomy illustrations.


    It was a shame we didn't get the job, because I thought there was real potential in these boards and was looking forward to making them move.  If anything was wrong with them, it was simply that they really needed at least 20 seconds to work, and I think we were supposed to have 10 to 15.  The wrong kind of over-delivery, although that wasn't the reason they turned us down.  Here's what was finally broadcast instead:

    June 7th 2010