Exploring the ocean of WebGL with the RNLI



The mission

At the end of March this year, the RNLI gave us the task of updating their Respect The Water campaign website to match their updated message for 2017, “Fight your instinct, not the water”. The new focus was to teach people a basic but important lesson: how to resist the cold water shock experienced when falling unexpectedly into water, and then how to float to increase their chances of survival.

Our close collaboration with the RNLI meant that we were easily able to coordinate the update to the website so that it aligned with their video advert, in which the audience is taken through the steps of how to float.

poster image for video

After examining the success of the Respect The Water campaign from the year before we discovered that the majority of users were accessing the site via their mobile devices, and that the old interactive piece, whilst captivating, often saw users leaving the site once completed. Using this knowledge we decided to strip things back for the user, and focus on a mobile-first approach that would take the user through the single important message: how to float when falling into water unexpectedly.

We looked at a number of different web technologies such as gamified videos (which we used in the previous campaign), 360 video and WebGL. After looking at what each option could provide, and how we could utilise each of their strengths with the somewhat limited timescale we had, we decided to push for the use of WebGL. It was a technology that we had been interested in for a while but hadn’t had a chance to truly experiment with.

So what is WebGL?

A simple description of WebGL is that it’s a JavaScript API used for rendering 3D content directly in web browsers. The capabilities it offers, however, go somewhat further than that: people have built everything from simple platform games all the way up to full-blown virtual reality content with it. You can view a number of experiments that people have submitted to Chrome Experiments at chromeexperiments.com/webgl.
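At its most basic, using the API means requesting a rendering context from a `<canvas>` element and issuing drawing commands through it. As an illustrative sketch (not code from the campaign site), a minimal setup that clears the canvas to a flat colour might look like:

```javascript
// Convert a CSS hex colour such as "#1a6e8e" into the normalised
// [r, g, b] triple that gl.clearColor() expects (values in 0..1).
function hexToClearColor(hex) {
  const n = parseInt(hex.slice(1), 16);
  return [((n >> 16) & 255) / 255, ((n >> 8) & 255) / 255, (n & 255) / 255];
}

// Request a WebGL context from a canvas and clear it to a flat colour.
// Returns null if the browser has no WebGL support.
function initWebGL(canvas, hex) {
  const gl = canvas.getContext('webgl') || canvas.getContext('experimental-webgl');
  if (!gl) return null;
  const [r, g, b] = hexToClearColor(hex);
  gl.clearColor(r, g, b, 1.0);
  gl.clear(gl.COLOR_BUFFER_BIT);
  return gl;
}
```

In practice, most projects put a scene-graph library such as three.js on top of this raw API rather than issuing GL calls by hand.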

Our solution

We started off with the idea of creating an interactive WebGL piece that would accompany the television and cinema ad, in which the user would be presented with a stylised 3D model of a person who has fallen into cold water, which then automatically plays through each of the steps recommended to get into a floating position. There would also be the option to navigate back through each of the steps to look at them in more detail. With this idea in mind we spent a week creating a simple prototype.


Screenshots of our initial prototype.

With the initial prototype complete we had succeeded in creating a basic experience where the user would be presented with a prompt to interact (Swipe to explore). Once the user swiped, the character would fall into the (very flat) water, and would automatically start going through each of the steps to float. The progress bar on the right was intended to show how far the character was through the animation, and also provide a tool to navigate between each of the steps. With this working, we ran it through some user testing to see how people interacted with it.

Why unbiased user testing is important

By sharing the prototype with people who had no prior knowledge of the aim of the campaign, we were able to see how users interacted with the content with fresh eyes. In doing so we quickly discovered that our initial idea wasn’t working: because the animations played automatically and the text either changed too quickly or not at all, pretty much every one of our users had no idea what the actual message was. They also never realised that the progress bar we had designed could double as navigation.

Our solution, take 2

This information was invaluable and allowed us to quickly iterate on all of the areas that needed improvement. The first and most simple fix was disabling the autoplay of the animation. In doing so, we then identified our next important change.

When we first started on the project, our aim centred on making the experience simpler. Clearly the progress bar did not fit that brief, so we removed it completely and went back to the drawing board. After some reworking of the overall UI, we refocused the piece so that it was more instructional and interactive, rather than the passive experience the autoplay had created in our first prototype.


It’s all in the detail

Once we had the UI and animations in a state where they were usable, useful and impactful, we were able to add the final bit of polish that we pride ourselves on at Yoyo. First of all, we refined the lighting of the scene to better balance the colours. We also worked a lot on the water itself.

An important part of the animation was keeping it performant wherever possible. This included removing shadows when we detected the frames per second dropping too low, and trimming the fat from the file used to create the character itself. We reduced the file size both by rounding the coordinates in the character model to three decimal places and by running the code through some minification, which brought the file size down by around 400KB.
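Neither of those tweaks is exotic, and both can be sketched in a few lines. The function names below are illustrative rather than the actual campaign code:

```javascript
// Keep a rolling window of frame durations and report the average FPS,
// so heavy effects such as shadows can be switched off when it drops.
function makeFpsMonitor(windowSize) {
  const samples = [];
  return {
    record(frameMs) {
      samples.push(frameMs);
      if (samples.length > windowSize) samples.shift();
    },
    averageFps() {
      if (samples.length === 0) return 0;
      const mean = samples.reduce((a, b) => a + b, 0) / samples.length;
      return 1000 / mean;
    },
  };
}

// Round vertex coordinates to a fixed number of decimal places --
// the same idea we used to shave weight off the character model file.
function trimPrecision(vertices, decimals) {
  const factor = 10 ** decimals;
  return vertices.map((v) => Math.round(v * factor) / factor);
}
```

In a render loop you would call `record()` with each frame’s duration and disable shadow rendering once `averageFps()` falls below some threshold, such as 30.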

Whilst keeping performance in mind, we also wanted to do something about the flat water. It felt unnatural, especially with a character floating in it. To improve this we added some dynamic waves with a low-poly feel to give the environment more movement. Once the waves were animating, we were able to bind the character to the water so that he floated up and down on the waves. It was a fairly small touch that added a ton of atmosphere to the scene.
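A common way to achieve that kind of effect is to displace the water mesh’s vertices with overlapping sine waves, and to sample the same height function for the character’s vertical position so the two stay in sync. A sketch, with made-up amplitudes and frequencies:

```javascript
// Height of the water surface at (x, z) at time t: two overlapping
// sine waves give a simple, cheap swell.
function waveHeight(x, z, t) {
  return (
    Math.sin(x * 0.5 + t) * 0.4 +
    Math.sin(z * 0.3 + t * 0.8) * 0.2
  );
}

// Displace every vertex of a flat water mesh (positions stored as a
// flat [x, y, z, x, y, z, ...] array) for the current time.
function updateWater(positions, t) {
  for (let i = 0; i < positions.length; i += 3) {
    positions[i + 1] = waveHeight(positions[i], positions[i + 2], t);
  }
}

// Bind the character to the surface so he bobs with the waves.
function updateCharacterY(character, t) {
  character.y = waveHeight(character.x, character.z, t);
}
```

Because the character samples the same `waveHeight` function as the mesh, he always sits exactly on the surface with no extra bookkeeping.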

The final piece

After a few more rounds of testing, we were good to go. In those last stages we decided to remove the initial start screen and take the user straight into the experience so that the new start screen appeared after the character had fallen into the water. The work we did on improving the character animation helped to make this more impactful as we could display a better example of the struggle a person may experience when falling into cold water. It also helped to provide a good follow-on from the other campaign material being used to drive traffic to the website.

This was an extremely fun and humbling project to work on. Now we get to sit back and enjoy the positive impact it will hopefully have on those who find themselves unexpectedly submerged in cold water, whether from falling overboard at sea or slipping into a river whilst walking the dog.

You can see the full website at respectthewater.com and if you're interested, you can also explore the Respect the Water campaign 2016. 

Float to live

Fight your instincts, not the water

Everyone who falls unexpectedly into cold water has the same instinct: to swim hard. But if you float until the cold water shock passes, you will have a better chance of survival.

  1. Step 1 - Fight your instinct to panic or swim hard
  2. Step 2 - Lean back in the water to keep your airway clear
  3. Step 3 - Push your stomach up, extending your arms and legs
  4. Step 4 - Gently move your hands and feet to help you float
  5. Step 5 - In 60-90 seconds you’ll be able to control your breathing

What to do next

Swim to safety, call for help, or continue to float until help arrives

Learn more on floating