net 261


Special 20th anniversary issue
The voice of web design

Issue 261 : December 2014 : net.creativebloq.com

Prototype with Origami: Discover Facebook’s free and easy design toolkit for Quartz Composer

The top 23 JavaScript libraries every designer and developer needs to know

Exclusive reports, interviews & more!

Project: Build a modular CMS tailored to your needs


FEED

Events

Generate London

Audience meditation, pug GIFs and banjo playing – Oliver Lindberg reports from a conference like no other

Event report
Date: 26 September 2014
Location: London, UK
URL: generateconf.com


On Friday 26 September, the second annual Generate London conference, presented by this very magazine, took place in the lavish Grand Connaught Rooms in Covent Garden. It was attended by more than 300 web designers and devs from around the world. The one-day, two-track event kicked off with a keynote by creativity evangelist Denise Jacobs, who encouraged everyone to get in a creative frame of mind with a spot of group meditation. Her thought-provoking talk went on to discuss ways to control your brain’s natural creative capacities (slides at netm.ag/jacobs261).


PROFILE

Oliver Lindberg is net’s editor and has been working with the title since the Iron Age. He was once described as a “dancing machine”


The presentation set the tone for what was to be a very inspirational day.

Freelance web designer Meagan Fisher had prepared possibly the funniest slideshow ever seen at an industry event. Her presentation was packed with useful advice on how to be ‘less terrible at the business of design’, masterfully illustrated with a variety of cat and pug GIFs. Meanwhile, the second track kicked off with Google’s Jake Archibald – presenting shoeless as always – introducing us to the ServiceWorker, which enables us to build apps with good offline behaviour.

Other morning highlights included Elliot Jay Stocks’ presentation on advanced web typography (“Every time you say font instead of typeface, Erik Spiekermann kills a kitten,” he warned) and Zoe Mickley Gillenwater, UX designer for Booking.com, who gave a very honest talk about the CSS tips and tricks she had learned the hard way.

The mindset of “I can’t start until I know enough to do it perfectly” resonated with the audience, and Gillenwater reassured everyone that this industry is constantly evolving and you simply can’t (and don’t need to!) know everything (slides at netm.ag/gillenwater-261).

There was no danger of feeling a bit sleepy after the mouth-watering buffet lunch, as Aardman’s Gavin Strange squeezed his 40 minutes full of boundless enthusiasm. His work showed the audience not to be afraid of trying new things, and his infectious passion reminded everyone why we got into this industry in the first place (slides at netm.ag/strange-261). Jeremy Keith’s closing keynote covered progressive enhancement, stressed that websites don’t need to look exactly the same in every browser, and emphasised that it’s our job to keep the web moving forward.

The day drew to a close with Meagan Fisher taking to the stage once again to interview Dan Cederholm. The conversation took in the Dribbble cofounder’s conversion to Sass, and he managed to weave a quick banjo demonstration into one memorable HTML/Sass analogy. As you do.

Photos of Generate London can be found at netm.ag/photos-261, and videos of all the talks will be on Creative Bloq’s YouTube channel soon (netm.ag/CByoutube-261). Check out generateconf.com for info on both Generate NYC and London 2015.



EVENT GUIDE

AntarcticJS
Date: 14-22 Nov 2014 [TBC]
Location: Antarctica
“No, it’s not a joke,” say the organisers of the ‘first JavaScript Conference on the last continent’. Details are yet to be confirmed. But, reassuringly, you can attend online – so no need to pack the thermals. antarcticjs.com

HOW
Date: 17-19 Nov 2014
Location: San Francisco, US
The San Fran leg of HOW invites you to meet the designers, developers and programmers behind projects for eBay, Twitter and more. howinteractiveconference.com

JSConf.Asia 2014
Date: 20-21 Nov 2014
Location: Singapore
Airbnb’s Spike Brehm and open source software dev Wei Lu will speak at Asia’s “most influential web developer conference”. 2014.jsconf.asia

CSSConf
Date: 9 Dec 2014
Location: Oakland, US
This one-day event focuses on frontend design, with speakers including Dropbox designer Daniel Eden and HackerYou lead instructor Brenna O’Brien. cssconfoak.land

beyond tellerrand
Date: 11-13 May 2015
Location: Düsseldorf, DE
Tellerrand’s next Düsseldorf outing may not come around until next spring, but early bird tickets are already sold out – so if you fancy it, shake a leg. beyondtellerrand.com



Opinions, thoughts & advice

Designing with data

The future of the web is ...

28

Peter Smart explores a future web that you can touch, that responds to your every whim, and that is integrated seamlessly into your environment

Data versus intuition

Data is useful, but it can’t replace creative intuition. Designers need to put some faith back in their instincts, argues Natasha Irizarry

interview 32

John Allsopp, author of ‘A Dao of Web Design’, argues that Sass isn’t all that great, and predicts a future centred around JavaScript

Earlier this year I accepted a position at a dating site with millions of users. I was hired to design user experiments, which became a battle between data and my intuition as a designer. Eventually, I found the right balance between what the data indicated and what my intuition was telling me, and hit my goals.

Following my intuition

What are we worth?

42

How much are your skills really worth? Sean Johnson suggests pricing projects based on the value to the client, rather than time taken

My first project was to improve revenue by refining an upgrade page. Two experiments were built to test against the control, both of which broke the site’s overall template. Experiment A included a credit card form within the page, and Experiment B included a refreshed design of the control. The experiment designs were more aesthetically pleasing, but still under-performed the control upgrade page by 0.5-1 per cent.



The control page was fairly skeletal: it included a list of features on one side of the page and a subscription selection form with a call to action. Analysing the data revealed that Experiment A performed the worst of all three, and breaking the template did nothing for subscription rates. While refining things, the control’s template and the purchase flow (which included the credit card form popping up in a new window) remained intact within the experiment. Though not pretty, staying close to the control’s design kept the user’s trust intact.

Applying the data

After restarting the experiment, there still weren’t any measurable performance indicators. It felt like hitting a wall. I started making one change at a time, but it soon became clear that this process yielded very small productive results – and the devs started to get annoyed with all the tests. Then I had an epiphany: I was chasing the local maximum (netm.ag/maximum-261) – I had hit the testing limit. I wanted to innovate, but most of all I wanted users to feel enough passion about the product that they would want to pay to upgrade to the premium service. My new goal was to figure out why these experiments kept failing. To solve this issue, the right questions needed to be asked. The problem had to be more than aesthetics, right? That’s when the combination of data and my gut feelings worked together, and solid hypotheses were developed to test.

Getting it just right

One thing the product had going for it was a brand that is valued by its users. I started creating emails that were sent out to a small percentage of the user base, and saw results quickly. My idea was this: if even one of our users felt some kind of emotion from this email and they engaged with it, something new could be learned. Stepping out of the current style guide let me focus on the brand itself. I created fun and playful emails that performed favourably for both the company and users, but not without criticism from my teammates for pushing the limits of the style guide. Of course, more tests were needed to prove this method was working.

A version of the email was tested that had been adjusted based on what the data was saying, resulting in a robotic message that under-performed the control – leaving me to assume that the data couldn’t tell me, or anyone else for that matter, how to design anything related to these issues. Emotion had to be applied in the designs, through tone or imagery, in order to get results that impacted the user, as well as our metrics.

The greatest risk

In the end, taking a very risky, humanistic approach made the user feel not just wanted, but needed as well. And what user isn’t needed? The goal wasn’t just monetisation, it was also for the user to invest in the product. Making assumptions that you know what the user wants and that the data will tell you everything, well, ‘makes an ass out of u and me’.

Designers become obsessed with innovating and forget the real reason why we do what we do, especially in the context of designing with data. We have to consider data and our own intuition, applying both to what we build. Failures are inevitable – until we apply what we learn from how we fail, we will continue to chase the local maximum instead of innovating. My opinion is this: don’t hold data above common sense. Take chances. Follow your intuition. Let data support your design, not define it.

PROFILE


Natasha (@natashairizarry) is a self-proclaimed UX evangelist. She works as a consultant for companies that have user experience and design-related problems



VOICES Essay




20 YEARS OF INNOVATIONS

The future of the web is ...

Illustration by Ben Mounsey

Imagine a world where you can touch products in a webshop, walls transform into touchscreens, and passwords are unheard of. It’s not as far away as you might think, says Peter Smart

The internet as we know it is less than 8,500 days old. Yet, in that time it has revolutionised the way the world lives, works and plays. In a generation, we’ve seen the world mapped in amazing detail; real-time, face-to-face communication made possible with people on the other side of the planet; and the vastness of human knowledge retrievable in a fraction of a second. But this is nothing compared to what is about to come. Scott Cook, the founder of Intuit (intuit.com), claims “we’re still in the first minutes of the first day of the internet revolution” – and with the technological breakthroughs happening at the frontiers of our industry, it’s hard to disagree. No one can predict the future, but we can explore the technologies set to revolutionise our industry tomorrow by examining the leading research and cutting-edge developments happening at its forefront today.

The future is ... tangible

Imagine being able to touch the web. The human body is incredibly well adapted for touch: millions of touch receptors cover our skin, allowing us to better navigate our physical world. Yet tactile sensation in the digital world is almost non-existent. Today, however, major tech firms including Apple, Samsung and Disney are developing haptics for the web: the ability to pick up our devices and not only see, but feel, what we are looking at. Electrostatic Vibration (netm.ag/vibration-261) is a new technology that can manipulate the precise amount of friction a finger feels when travelling across a touch surface.

By increasing and decreasing this friction, it is possible to dynamically recreate the feeling of bumps, ridges, edges and texture. This type of technology suggests incredible possibilities for the web. A tactile web would revolutionise shopping experiences. Studies show that touch has a significant, unconscious grip over decision-making processes. When I am able to feel how soft the coat I am looking to buy is, how will that alter the amount I’m willing to pay for it? There could also be implications for education. Studies show it’s not just how we think, but what we do while we think, that impacts how we absorb information. How are haptic technologies set to offer amazing new possibilities for learning via digital media? Finally, imagine Skype. Michelangelo said, “To touch can be to give life.” A tangible web will allow us to be on a face-to-face call with a loved one on the other side of the world, and be able to touch them. The implications here for virtual human communication are enormous. This is why even the advances that we see today, like sending a touch to someone through the Apple Watch, point to an amazing future tomorrow.

The future is ... adaptive

The Internet of Things has captured the imagination of the media, manufacturers and businesses around the world and, according to Cisco, is set to become a $19 trillion global opportunity over the next decade. Although recent



Gallery Inspirational sites

Karolina Szczur

Sensational design and superb development

HTML5, CSS3, RWD, SVGs, Interchange

charitywater.org/september Charity: Water charitywater.org

Well-designed sites for charity organisations are hard to find, but Charity: Water’s is certainly one of them. The September campaign features beautiful, fully responsive imagery. “With some of our larger images, we tried out some srcset mimicry. We used a data-interchange attribute on the img or div (in the case of background images) tags that designated properly sized images,” says Christina Lutters, who was responsible for frontend implementation. “The proper image was downloaded and shown using a Zurb Foundation plugin called Interchange.”


One of the team’s designers, Mike Smith, created some beautiful painting and hand-lettering to enhance the campaign. “To preserve the crispness of his work as we scaled the lettering responsively, we used SVGs,” Lutters adds. The most challenging aspect was to create a smooth flow between sections. “Because it’s responsive, the height of the sections changes, altering the height of the images and the flow of the gradients. We had to work a combo of unusual background image placement (using :after ), gradient breakpoints/overlaps, and some light noise to blend it all together.”

Karolina is a designer, developer, photographer and writer at &yet. She’s also an open source aficionado and runs CSSConf in Oakland, California w: thefox.is t: @fox



HTML5, CSS3, RWD, grunticon, js animations

responsivewebdesign.com Ethan Marcotte ethanmarcotte.com Karen McGrane karenmcgrane.com

Ethan Marcotte coined the term ‘responsive web design’ four years ago in an A List Apart article that was shortly followed by a descriptive book on responsive theory and practice. Now, together with Karen McGrane (author of Content Strategy for Mobile, published by A Book Apart), he’s launched a site featuring workshops, public events and a podcast. And it’s all about responsiveness. Perfectly aligned with Marcotte’s book branding, the site sets speed as its first concern. “Performance was a priority for us. I set up a simple performance budget, and it was actually freeing to view the site’s design through that lens. There’s still plenty of work we can do, but it’s satisfying to see how quickly our site loads over a spotty 3G connection,” says Marcotte. “We’re using Grunt to manage common tasks and a number of Filament Group-produced utilities to keep progressive enhancement and performance front-and-centre,” adds Marcotte. He also mentions such tools as grunticon, grunt-criticalcss, CSS and JavaScript loaders, as well as fixed-fixed.js. The result is a beautiful and faultless example of responsive web design in action.



PROJECTS Prototype

Download the files here! Download all the files you need for this tutorial at netm.ag/origami-261

About the author

Austin Bales
w: austinbales.com t: @arbales
job: Product design manager, Facebook
areas of expertise: Design
q: Who would play you in a movie of your life? a: Ryan Phillippe

prototype

Build a prototype with Quartz Composer

Austin Bales walks through how to create an interactive photo viewer prototype using Quartz Composer and Origami

Video: Austin Bales has created an exclusive screencast to accompany this tutorial. Watch along at netm.ag/origamivid-261


Quartz Composer (netm.ag/QC-261) is a powerful motion graphics tool from Apple, originally used for everything from broadcast graphics to screensavers. It’s a non-linear, visual tool – it works by connecting dots and ideas, rather than writing code or defining fixed keyframes. Although it wasn’t originally created for interaction design work, its flexibility and speed make it an ideal environment for designers. Origami (netm.ag/origamiFB-261) is a toolkit for QC, developed by Facebook, that makes prototyping easier by encapsulating common patterns and concepts used by interaction designers. Prototyping with QC is fast: you can change a composition (edit values, make new connections between patches, swap in a new image) on the fly, without having to recompile or wait for that result to take effect in your prototype.


This makes QC much faster than a code environment that depends on reloading, or a video-based tool that requires re-rendering. Although this tutorial walks you through to a final prototype in a straightforward manner, it’s much more common for designers to play around in QC, trying different ports with different values. Building skills in QC takes time. This tutorial is meant to introduce you to this powerful tool, give you a starting point for future exploration, and lay a foundation for building and practising. We’ll cover the basics of creating compositions: working with images, transitions, animations, switches and the most simple of interactions. To get started, you’ll need to get both QC and Origami by following the instructions at origami.facebook.com.


01 When you create a new Quartz Composer (QC) composition, you’ll be presented with an Editor and a Viewer window. The Editor is a grid view that holds patches – for now, just a Clear. The Viewer will be blank, since all that’s in our document is a Clear field of black. As you insert more patches into the Editor, you’ll see the Viewer update in real time.

02 Inserting Origami’s Phone patch is the first step in any mobile QC composition. To insert a patch, press cmd+Enter to bring up the Library window. Type ‘Phone’ into the typeahead and press the Enter key to insert the patch into the Editor. You’ll see a white iPhone appear in the Viewer. Follow the same steps to insert the Layer Group and Phone Dimensions patches.

03 Patches are representations of functionality, data or images. They receive input and provide output through ‘noodles’ connected to labelled ‘ports’. Data flows through these ports from left to right. Connect the Pixels Wide port on Phone Dimensions to the corresponding port on Layer Group by clicking the dot on the first patch, and then clicking on the dot on the second (you can also drag between ports to connect them). Then you can connect Layer Group’s Image port to Phone’s Screen Image port.

04 Layer Group is a macro patch that flattens the layers and contents inside it into a single image – think of it like a Smart Object in Photoshop. All of our subsequent work will take place inside this patch. Double-click the patch to open it. You can get back up in your hierarchy at any time by pressing the Edit Parent button in your toolbar.

Step 6 Change the X Position, Y Position and Opacity values to explore how the on-screen layers behave

05 Now that we’re in our Layer Group, drag the file PhotoGrid.png into the Editor window. You’ll see both an Image and Layer patch appear in the Editor, and see the image itself appear inside the Viewer. You’ll notice that the Image and Layer patches are connected via their Image ports. It may sound complicated, but it actually makes a lot of sense. Individual images in QC might be connected to multiple layers, or you may decide to run them through filters or other patches before connecting them to a Layer.

06 To get familiar with the way Layers work in Quartz Composer, select the Layer patch from the previous step and press cmd+T to view the patch’s parameters. Play around with the X Position, Y Position and Opacity ports a bit to get a sense of how layers on screen move and behave. You’ll notice that QC works on a coordinate plane system with 0 being at the centre of the screen, and that Opacity works on a decimal system, running from 0 to 1.

Step 1 QC compositions are presented in an Editor window (left) and a Viewer window

expert tip

Fixes and enhancements

Origami includes several fixes and enhancements to Quartz Composer, including support for inline values, linear ‘noodles’, Retina support, and some convenient window layout shortcuts. Whenever I begin a composition I click Window > Resize to Thirds to give me a usable window layout. I also recommend disabling Keep Patch Library Visible in Preferences > Editor.



PROJECTS JavaScript

Download the files here! All the files you need for this tutorial can be found at netm.ag/p5-261

About the author

Scott Garner
w: scott.j38.net t: @scottgarner
areas of expertise: Creative technology, interactive design
q: Who would play you in the movie of your life? a: A computer generated viscacha

JavaScript

Explore creative code with p5.js

p5.js seeks to bring the power and flexibility of Processing to the web. Scott Garner introduces the new tool for creative coders

Video: Watch an exclusive screencast of this tutorial created by Scott Garner at netm.ag/p5Vid-261


The new p5.js (p5js.org) is a library designed to bring the power of Processing (processing.org) to the web. It aims to introduce artists, designers and educators to the world of programming, while also offering versatile tools to bring devs and engineers into the visual arts. Let’s dive in and create our first ‘sketch’. Our goal is to build a drawing tool that transforms a simple image into a field of animated stars. First, we’ll define a few global variables and write our setup() function. p5’s setup() is run once, when the sketch is loaded, so it’s the ideal place to handle initialisation.

var hintImage, skyImage, stars = [];

function setup() {
  ...
}


Inside our setup function, we’ll create a canvas and hide the mouse cursor so we can draw our own. By default, p5 adds an outline around shapes – we want to disable strokes in this case.

createCanvas(800, 500);
noCursor();
noStroke();

Next, we’ll load a pair of images. One will serve as the background – in this case, a night sky scene. The other will be the ‘hint’ image – the black and white design (seen overleaf) our final design will be based on. The idea is to put most of the stars over black pixels in our hint image, to recreate the design in our background scene.


It would be easy to create these images with p5’s text and drawing tools, but for the sake of brevity we’ll use static assets.

hintImage = loadImage("//bit.ly/hintImage");
skyImage = loadImage("//bit.ly/skyImage");
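Putting those pieces together, the setup() function we’ve built so far looks something like this (a consolidation of the snippets above, nothing new added):

var hintImage, skyImage, stars = [];

function setup() {
  // Create the drawing surface, hide the default cursor and disable outlines
  createCanvas(800, 500);
  noCursor();
  noStroke();

  // The night-sky background and the black and white 'hint' image the stars will follow
  hintImage = loadImage("//bit.ly/hintImage");
  skyImage = loadImage("//bit.ly/skyImage");
}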

The draw function

That’s it for setup()! Another key function is draw(). It’s called in a continuous loop, which is helpful for animations and adding elements over time.

function draw() {
  ...
}

Within the draw() function, our first task is to fill the canvas with our background image. p5 doesn’t automatically clear the canvas between draw() calls, so we need to do this every frame or we’ll end up with some strange accumulation effects. To place a loaded image on the canvas, use the image() function and give x and y coordinates for positioning.

image(skyImage, 0, 0);

p5.js aims to offer versatile tools to bring devs and engineers into the visual arts

Next, we’ll grab the current mouse location and store it as a p5.Vector using createVector(). This object comes with handy functions to deal with points in space, but we’re mostly just using it as a container.

Focus on

A bit of background

Processing is a programming language and development environment originally designed to teach the basics of computer programming within a visual arts context. Since its creation in 2001 by Ben Fry and Casey Reas, it has continued to expand and evolve into one of the most powerful platforms for creative coding. While screen-based drawing remains a central focus, it is now used for everything from generative sound projects to digital fabrication.

The goal of p5.js is to take the core ideas of Processing and adapt them to the modern web. This not only means a shift from Java to JavaScript, but also the introduction of new features like HTML DOM manipulation, and a broader reach than was previously possible. It also means you have the full power of the web at your disposal, allowing you to interface with other JavaScript libraries, access popular web-based APIs and easily target any device with a browser – from desktops to laptops to mobile phones.

One of the biggest advantages of working with p5.js is 13 years’ worth of tutorials, examples and other learning materials created for Processing. In most cases, core principles translate directly, and code can often be adapted with only a few minor tweaks. Again, teaching is an essential element in p5, both for artists and designers interested in programming, and for programmers interested in the visual arts – so it’s a great place to start, regardless of your skill level.

The creation of p5 is a collaborative development process led by Lauren McCarthy, with contributors from around the world. The project is always looking for enthusiastic developers, designers, artists, teachers, authors and, really, any other role you could imagine. If you’d like to get involved, please get in touch at hello@p5js.org. All contributions are welcome.

var position = createVector(mouseX, mouseY);

Using our newly-stored mouse position, we can draw our cursor. We’ll set the drawing colour with fill() by passing RGB values, and use ellipse() to draw a circle at the mouse location.

fill(255, 192, 0);
ellipse(position.x, position.y, 8, 8);

We only want to draw new stars while the mouse is pressed, so we’ll check p5’s mouseIsPressed before proceeding. If the mouse is down, we need to calculate a good place for our next star to end up. We’ll do that with a custom function called findPixel(), which we’ll define later. Once we have a target, we’ll create a new instance of a custom Star object (more below) and push it into our stars array.
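The rest of the walkthrough falls outside this excerpt, so findPixel() and the Star object aren’t defined here. Purely as a rough, hypothetical sketch of how those remaining pieces might fit together (the actual implementations in the tutorial’s download files will differ), the finished loop could look something like this:

function draw() {
  image(skyImage, 0, 0);                        // repaint the background every frame

  var position = createVector(mouseX, mouseY);  // current mouse position
  fill(255, 192, 0);
  ellipse(position.x, position.y, 8, 8);        // our custom cursor

  if (mouseIsPressed) {
    stars.push(new Star(position, findPixel()));
  }

  // Update and render every star added so far
  for (var i = 0; i < stars.length; i++) {
    stars[i].update();
    stars[i].display();
  }
}

// Hypothetical findPixel(): pick a random point, preferring dark pixels in hintImage
function findPixel() {
  for (var tries = 0; tries < 100; tries++) {
    var x = floor(random(width));
    var y = floor(random(height));
    var c = hintImage.get(x, y);                // [r, g, b, a] values at that point
    if (c[0] < 50) {                            // dark enough to be part of the design
      return createVector(x, y);
    }
  }
  return createVector(random(width), random(height)); // fallback: anywhere on the canvas
}

// Hypothetical Star: starts at the cursor and eases towards its target position
function Star(start, target) {
  this.position = start.copy();
  this.target = target;
}

Star.prototype.update = function () {
  this.position.lerp(this.target, 0.05);        // drift a little closer each frame
};

Star.prototype.display = function () {
  fill(255);
  ellipse(this.position.x, this.position.y, 4, 4);
};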

Announcing p5

p5.js was released to the public last summer. See the interactive announcement video at hello.p5js.org



PROJECTS WordPress

Download the files here! All the files you need for this tutorial can be found at netm.ag/modularGit-261

About the author

Mark Llobrera
w: bluecadet.com t: @dirtystylus
job: Senior developer, Bluecadet
areas of expertise: HTML, JavaScript, WordPress, Drupal
q: Who would play you in a movie of your life? a: Tilda Swinton

WordPress

Build modular content systems in WordPress

Mark Llobrera walks through how to use Advanced Custom Fields to create a WordPress CMS that is flexible as well as structurally sound

Video: Mark Llobrera has created a screencast to accompany this tutorial. Watch along at netm.ag/modularVid-261


The key word that has informed my work for the last few years has been ‘flexible’. A website needs to be flexible for the end user, morphing to meet them on their chosen device. But our content authors need flexibility, too, in how they create content for the site. They need to be able to mix different kinds of content to create a compelling site. One of the ways we talk about this flexibility for both users and authors is using the term ‘modular content’. Loosely defined, modular content means breaking down our content into smaller chunks that can be combined in many different ways.


It is not a new idea, but it is one that has been given new urgency by the multi-screen, multi-device web. A ‘page’ is simply too big a unit – it must be broken down into smaller components to enable it to bend and change to better serve its authors and users. WordPress has a few options that can be used to add some of this flexibility into a CMS. Elliot Condon’s Advanced Custom Fields (advancedcustomfields.com/pro) has long been an essential plugin for extending the capabilities of WordPress in order to build custom CMSs. The plugin has two features that are well suited to our goals of creating a CMS that supports our flexible design


system, namely the Flexible Content Field and the Repeater Field. These features allow designers and developers to create modules and craft markup to represent those modules, but they also give content authors flexibility in how they order those modules. So instead of needing many different page templates, we can use a smaller number of flexible content types and templates that are able to render a wide range of possible pages.

Identifying our modules

Let’s start with an example: a product details page template. A common scenario is to have a large, evocative image at the top, supported by a section (or sections) highlighting individual features (with an optional image). Finally, callout blocks to related products could follow. We can start by identifying our design system. There are a number of ways to do this, but at the moment I like to sketch out my site’s possible templates and what they will need to support. Once I’ve done that, I start circling common visual structures that can support those content modules.

From there I can move on to wireframes that clearly identify those modules. In the scenario shown in the image on page 111, the template calls for three modules: A) a full-width image with text layered on top, B) a centred image with text running underneath it, and C) a series of callout blocks that have a thumbnail image with text layered on top. Our goal is a template that can support any of those blocks, in any order: ABC, ACB, BAC, BCA, ABBC and so on. Certain orders will make more sense given the desired hierarchy of the page, but we want to be able to support all options.
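To make that idea concrete, here is a tiny, hypothetical sketch in JavaScript – not ACF’s actual data format, just an illustration – of a page modelled as an ordered list of typed modules that a single template can render in any order:

// Hypothetical data only: in ACF this structure would come from the Flexible Content Field
var productPage = [
  { type: 'hero',     image: 'product-hero.jpg', text: 'A big, evocative opener' },
  { type: 'feature',  image: 'detail.jpg',       text: 'One highlighted feature' },
  { type: 'callouts', items: ['Related product A', 'Related product B'] }
];

function renderModule(module) {
  // In a real build, each type would map to its own chunk of crafted markup
  console.log('render a ' + module.type + ' module');
}

// Authors can reorder the array (ABC, ACB, BCA and so on) without needing a new template
productPage.forEach(renderModule);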

Case study

Our toughest client: ourselves

At Bluecadet we’ve had a lot of experience building modular content systems for our clients. Our toughest client this year, however, might have been ourselves. We entered this year with the goal of redesigning our website (bluecadet.com), and it took a lot of hard work to create a design system that could accommodate the range of our work. For our case study pages we needed support for varying numbers and combinations of text, video and imagery. Additionally, we needed the system to be intuitive for our teammates who would be entering and maintaining content. We ended up creating a design system with modular content blocks, including: a full-width single image, an image slideshow, a video player, a video loop, a pull quote, and regular text. Once we started building in WordPress, Advanced Custom Fields’ Flexible Content Field allowed us to create each content block as a separate layout, all shared by one post type and one template file. Our content authors appreciated the ease of creating and re-ordering the different module layouts, which let them put together the right layout for each project case study page.

Creating modules in WordPress

Now that we have an idea of what kinds of content we’ll need to support, we can start creating our modules in WordPress. We will use Advanced Custom Fields Pro (advancedcustomfields.com/pro), a paid plugin that includes the Flexible Content Field and Repeater Field. Once you have the ACF Pro plugin activated, go to the new menu item Custom Fields > Custom Fields and add a new Field Group. Call it ‘Core

Mix’n’match: Content authors create layouts from a range of content modules


