me

Ben Hawkyard

Senior Pipeline Developer • Absolute Post

bhawkyard1@gmail.com • +44 7948579781

Hello! I'm Ben, a pipeline developer with experience in film VFX and integrated advertising studios. I'm currently working at Absolute Post as a senior pipeline developer.

I use Python, Qt and Git every day at work, and I also have experience with Linux, C++, OpenGL and database programming. Recently, I've been particularly interested in implementing robust developer workflows to help the teams I'm part of build large-scale tools and systems quickly and efficiently.

Have a read about some of the places and projects I've worked on below, and please get in touch if you have any questions!

Absolute Post

I have been working at Absolute Post since November 2022 as a senior pipeline developer. In this role I'm responsible for working with the engineering team, the artists and the other developers to plan and deliver improvements to the studio pipeline as we grow and scale.

Saddington Baynes

I worked at Saddington Baynes from August 2019 until October 2022 as a pipeline developer. The studio worked on a range of advertising and automotive content, and I focused on large-scale tooling and strategies aimed at streamlining the car configurator projects we took on and unifying the pipeline used by different sections of the business.

In particular, I wrote an asset management system which allowed the user to quickly wrangle many shots in parallel. By representing assets as modular units linked together by dependencies, we could build up large procedural hierarchies that could be overridden in a very granular way, allowing the user to make sparse overrides, such as tweaking the appearance of a single shader in a specific shot.

Representing our assets this way, outside of scene files, meant we could build our scenes headlessly, without user interaction: a user could make a sparse edit with only a minimal set of assets loaded, then trigger many renders without having to open render scene files and submit them manually. Overall, this really tightened the feedback loop between editing a CG asset and getting renders showing the change in the wider shot context, which increased our throughput considerably.
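
To make the idea concrete, here's a minimal sketch of that structure in Python. All of the names and the exact resolution logic here are illustrative rather than the production API:

```python
# Minimal sketch: assets as modular nodes with dependencies, plus sparse
# per-shot overrides resolved at build time. Names are illustrative.
from dataclasses import dataclass, field

@dataclass
class Asset:
    name: str
    attributes: dict
    dependencies: list = field(default_factory=list)

def resolve(asset, overrides):
    """Walk the dependency graph, applying any sparse overrides
    registered for each asset along the way."""
    resolved = dict(asset.attributes)
    resolved.update(overrides.get(asset.name, {}))
    children = [resolve(dep, overrides) for dep in asset.dependencies]
    return {"name": asset.name, "attributes": resolved, "children": children}

# A sparse edit: tweak a single shader's roughness in one shot only.
shader = Asset("car_paint_shader", {"roughness": 0.4})
car = Asset("hero_car", {"variant": "GT"}, dependencies=[shader])
shot_overrides = {"car_paint_shader": {"roughness": 0.2}}
scene = resolve(car, shot_overrides)  # buildable headlessly, no scene file open
```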

This project was a large undertaking, so thorough automated testing and documentation, both developer- and artist-facing, were important. The automated testing in particular allowed me to maintain velocity in delivering features even as the project matured. We also made extensive use of Graylog to log interesting messages and errors to a central server. This meant that if a user encountered an unhandled exception, we'd receive a report detailing the full traceback, the path to their scene file, and other diagnostic data.
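
As a rough illustration, the reporting side of this can be wired up in a few lines of Python, assuming the graypy library for GELF transport; the server address and the extra fields below are placeholders:

```python
# Sketch of central exception reporting to Graylog via graypy.
import logging
import sys

import graypy

logger = logging.getLogger("pipeline")
logger.setLevel(logging.INFO)
logger.addHandler(graypy.GELFUDPHandler("graylog.example.local", 12201))

def report_uncaught(exc_type, exc_value, exc_tb):
    # Ship the full traceback plus extra diagnostic context to the server.
    logger.error(
        "Unhandled exception",
        exc_info=(exc_type, exc_value, exc_tb),
        extra={"scene_file": "/path/to/current/scene.mb"},  # hypothetical field
    )
    sys.__excepthook__(exc_type, exc_value, exc_tb)

sys.excepthook = report_uncaught
```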

MPC

I worked at MPC between November 2018 and August 2019 as a software developer for the animation department. This role consisted of liaising with animators of all levels to develop tools to solve production challenges. It also involved working with the core software teams to suggest changes and improvements to core pipeline systems and APIs.

I supported the production of Maleficent: Mistress of Evil and had the opportunity to develop tooling to facilitate the transfer of facial capture data from rigging into the animation pipeline. I also produced tools to add dynamic secondary motion to animated transform hierarchies, bake complex character animation intelligently, and verify the integrity of scene contents before they were passed downstream.
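
As an illustration of the last of those, a scene-integrity check of this kind might look something like the sketch below, written against Maya's maya.cmds API; the specific checks and thresholds are hypothetical, not the ones I shipped:

```python
# Illustrative scene-integrity checks using maya.cmds.
from maya import cmds

def validate_scene():
    problems = []
    # Flag mesh transforms that haven't had their transforms frozen.
    for node in cmds.ls(type="transform") or []:
        if cmds.listRelatives(node, shapes=True, type="mesh"):
            translate = cmds.getAttr(node + ".translate")[0]
            if any(abs(value) > 1e-6 for value in translate):
                problems.append(f"{node}: non-zero translation {translate}")
    # Flag animation keys that fall outside the playback range.
    start = cmds.playbackOptions(query=True, minTime=True)
    end = cmds.playbackOptions(query=True, maxTime=True)
    for curve in cmds.ls(type="animCurve") or []:
        times = cmds.keyframe(curve, query=True, timeChange=True) or []
        if any(t < start or t > end for t in times):
            problems.append(f"{curve}: keys outside {start}-{end}")
    return problems
```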

Framestore

I worked at Framestore from July 2017 to November 2018 as an Assistant Technical Director in the film department. This role consisted of working with the supervision team on the assigned film to troubleshoot any technical issues that arose, and developing tools to improve artist workflows more generally.

While at Framestore I had the opportunity to support the production of Paddington 2, Christopher Robin and Detective Pikachu. In addition to this, I worked on a range of different tools, including one to improve the backplate workflow in the animation department.

Education and Personal Projects

I graduated from the National Centre for Computer Animation at Bournemouth University in 2017, with first-class honours in Computer Visualisation and Animation. I worked on a range of different projects while I was studying:

Markov Model Music Visualiser

As a research project in my third year studying at the NCCA, I decided to look at music visualisation. I noticed that a lot of the visualisers I looked at didn't really do much with the broader features of the songs they displayed, instead just directly visualising the frequencies sampled at the current time. At the same time, I was writing a computer program that read song lyrics, built a Markov chain from them, and could then generate new lyrics based on the model.

It occurred to me that if I fed music to the program, rather than strings, I might be able to do some interesting things with the result. I modified the program to extract notes from raw audio data using a Fourier transform, then used lists of notes as the underlying data type with which to construct the model.
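
The core of the model is simple to sketch. The real project was C++, but the idea, shown here in illustrative Python, is just a first-order Markov chain over note lists:

```python
# Build a first-order Markov chain over notes, then sample from it.
import random
from collections import defaultdict

def build_chain(note_sequence):
    transitions = defaultdict(list)
    for current, following in zip(note_sequence, note_sequence[1:]):
        transitions[current].append(following)
    return transitions

def generate(transitions, start, length):
    state, output = start, [start]
    for _ in range(length - 1):
        options = transitions.get(state)
        if not options:
            break  # dead end: this state was only ever seen last
        state = random.choice(options)
        output.append(state)
    return output

# Chords become tuples of notes, so they hash as single states.
melody = [("C4",), ("E4", "G4"), ("C4",), ("A4",), ("E4", "G4"), ("C4",)]
print(generate(build_chain(melody), ("C4",), 8))
```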

Once I had this, I visualised the structure of the Markov chain. Each internal node had no innate position in the world, which made figuring out how to place them all a bit of a challenge.

In the end, I opted to assign each node on screen physical properties based on its counterpart within the Markov model (e.g. nodes with more connections had greater mass). I then ran a simple physics simulation in which connected nodes were attracted to one another, with user-controllable parameters such as ambient friction and gravitational attenuation. The end result was that a kind of abstract sculpture would form over time.
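
A stripped-down version of one simulation step, in illustrative Python with numpy (the parameter values are placeholders):

```python
# One step of the force-directed layout: connected nodes attract,
# friction damps everything, heavier nodes accelerate less.
import numpy as np

def step(positions, velocities, edges, masses, dt=0.016,
         attraction=2.0, friction=0.98):
    forces = np.zeros_like(positions)
    for a, b in edges:
        delta = positions[b] - positions[a]
        forces[a] += attraction * delta  # spring-like pull along each edge
        forces[b] -= attraction * delta
    velocities = (velocities + dt * forces / masses[:, None]) * friction
    return positions + dt * velocities, velocities

# Three nodes in a chain; the heavier middle node (more connections) moves less.
positions = np.random.rand(3, 3)
velocities = np.zeros((3, 3))
masses = np.array([1.0, 2.0, 1.0])
for _ in range(100):
    positions, velocities = step(positions, velocities, [(0, 1), (1, 2)], masses)
```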

While this was going on, I'd play the song and sample the audio at the current time, repeating the process that was used to produce the model initially. If I found a node tagged with the same notes that were present, I'd light it up on screen.

Graphically, the project has a few interesting features, including:

  • A lens flare shader, which takes the world positions and intensities of lights in the scene, projects them to screen space, and then draws flare effects (the projection step is sketched after this list).
  • A simple subsurface scattering model where the nodes in the scene have light propagated through them (mathematically simple since they are all spheres).
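
The projection step from the lens flare bullet, sketched in Python with numpy; the original lived in a GLSL shader, so this version is purely illustrative:

```python
# Project a world-space light position to pixel coordinates.
import numpy as np

def world_to_screen(world_pos, view_proj, width, height):
    clip = view_proj @ np.append(world_pos, 1.0)  # to clip space
    if clip[3] <= 0.0:
        return None  # behind the camera: no flare drawn
    ndc = clip[:3] / clip[3]  # perspective divide
    x = (ndc[0] * 0.5 + 0.5) * width
    y = (1.0 - (ndc[1] * 0.5 + 0.5)) * height  # flip Y for screen space
    return x, y
```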

The source code for this project is available here. It requires the NGL and SDL libraries, and a C++11 compiler.

Tools Used:

C++, OpenGL, SDL.

52nd West

For my final major project at Bournemouth University, I created an experimental game with William Stocks. The player is a real estate agent who finds his relationship with his wife falling apart after a deal to get her new business a studio space goes badly wrong. Through the decisions the player faces, they have to choose whether to prioritise their career or their marriage.

The game logic was implemented using Unreal Engine's Blueprint system, which is great for quickly prototyping ideas. The game is split broadly into two parts: 2D sections where the user interacts with the characters in a "choose your own adventure" style, and interludes where the user explores the 3D environment and is confronted by the results of their earlier choices.

For the 2D sections, we wanted to retain the 3D camera and have it drift between different posters on a cork board, where the user could interact with elements to make choices and advance the story. This meant we needed to project text and user interface elements onto geometry, which was not well supported by Unreal Engine at the time. To implement it, I created a Blueprint class representing an interactive screen: it would spawn a UI on a 2D card alongside a camera, render the UI into a texture, and plug that texture into a shader applied back onto the 3D geometry.

In some sections of the story, the player also needs to find and interact with items. For this, I implemented a basic inventory system, where specially tagged entities in the scene could be picked up by the player, and optionally stored and attached to the player character.

The game has a fairly complex structure, with earlier choices affecting the shape the story takes later on. After planning out the logical structure on paper, I created a class to store player decisions, which could then be queried by the classes handling the branching structure of the story.
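
The actual implementation was a Blueprint class, but a Python analogue of the idea, with illustrative names, looks like this:

```python
# Illustrative analogue of the Blueprint decision store.
class DecisionStore:
    def __init__(self):
        self._choices = {}

    def record(self, decision_id, choice):
        self._choices[decision_id] = choice

    def made(self, decision_id, choice):
        return self._choices.get(decision_id) == choice

# Branching logic queries earlier choices to pick a story path.
store = DecisionStore()
store.record("studio_deal", "prioritise_career")
next_scene = "office" if store.made("studio_deal", "prioritise_career") else "home"
```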

A copy of the game can be downloaded from here.

My Roles:

Pipeline, Gameplay Programming, Lighting, Shading

Tools Used:

Python, UE4.

The Pterodactyl Joust

I produced a small game with four other NCCA students over a period of five months. It was created in UE4 and pits two or more players against one another in aerial, medieval-themed combat.

The most interesting technical challenge I had to overcome here was getting the wing membranes to:

  • Behave in a cloth-like, believable manner.
  • Not lower the game's performance too much.
  • Have depth, so that subsurface scattering worked correctly.

This was quite a challenge: at the time, cloth animation was not well supported in Unreal Engine, so I had to come up with quite a creative solution to get these attributes into the game:

The pterodactyl uses multiple rigs; the main rig drives the entire mesh minus the wing membranes. To create the simulation rig, I deleted all of the downward-facing polygons on the wing membranes and applied nCloth to the remaining 2D mesh. This was then stitched back onto the main mesh using nCloth constraints, so that the main rig also drove the movement of the simulation. Finally, a complete copy of the mesh was retained, so that the movement of the main rig and the simulated wings could be copied onto it later.

Once I had completed a section of animation for the game, it had to be baked out. To capture the detail and complexity of the motion, I wrote a script which placed a joint at every vertex of the simulated surface and animated each one to follow the wing. Finally, the target mesh was skinned to the newly animated joints. Skinning the mesh properly was tough, because I had to combine the weights from the wing joints with those from the main rig and have both work properly on the final mesh.
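
A condensed sketch of what that baking script does, against Maya's maya.cmds API; the names and the frame loop here are illustrative rather than the original code:

```python
# Place a joint at every vertex of the simulated mesh, then keyframe each
# joint to follow its vertex across the baked frame range.
from maya import cmds

def bake_cloth_to_joints(mesh, start, end):
    vertex_count = cmds.polyEvaluate(mesh, vertex=True)
    joints = []
    for i in range(vertex_count):
        cmds.select(clear=True)  # stop new joints parenting to the last one
        joints.append(cmds.joint(name=f"{mesh}_bake_{i}"))
    for frame in range(int(start), int(end) + 1):
        cmds.currentTime(frame)
        for i, joint in enumerate(joints):
            position = cmds.xform(f"{mesh}.vtx[{i}]", query=True,
                                  translation=True, worldSpace=True)
            cmds.xform(joint, translation=position, worldSpace=True)
            cmds.setKeyframe(joint, attribute="translate")
    return joints  # the render mesh is then skinned to these joints
```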

This is a fairly convoluted solution, but it works surprisingly well! Each player drives approximately 1,500 joints around the scene, well below UE4's generous limit of 65,000 per mesh.

My Roles:

Director, Animator, Simulation, Character Modelling.

Tools Used:

Maya, Unreal Engine 4, Headus UV Layout, Python.