
Lighting Design and the AI Revolution
By Ray Molony
The breathtaking advances in artificial intelligence technology are revolutionising a host of professions – will lighting design be one of them? Ray Molony speaks to some leading designers who have begun to incorporate its power into their work.
From accountancy to graphic design to (gulp!) journalism, artificial intelligence is upending a swathe of white collar professions. So what impact will AI have on lighting design? Much of the initial interest has been in its increasing prowess at image generation, especially platforms such as Midjourney, DALL-E and Stable Diffusion, but one could easily imagine a role for it in fixture specification and documentation.
Faraz Izhar, lead lighting designer with Dubai-based multidisciplinary practice AE7, has been one of the pioneers in exploring the tech and its capabilities.
Izhar says he started playing with AI and lighting design ‘out of curiosity’.
‘Lots of architects were experimenting with it already, and they were using it to develop concepts as inspirational images. So that’s how it caught my attention.’

Faraz Izhar, lead lighting designer with Dubai-based multidisciplinary practice AE7, is incorporating AI-generated images into storyboards and mood boards at the concept stage.
‘So I started exploring it, and at first I couldn’t get it working satisfactorily until version 4 of Midjourney came out. It took me a bit of time, but when I got the hang of it, I just couldn’t stop. I spent about three to four hours every weekend just talking to it, generating image after image, environment after environment.
‘And the thing is that it’s getting more responsive as it learns. When Midjourney version 4 came along, the prompt writing techniques started getting easier, and it started responding very well to lighting.
‘If I tell it to imagine a building, maybe arctic or gothic style, maybe a hotel or a fort, and I tell it to intricately light up the facade elements, it starts doing that and then it starts extrapolating.
‘You can tell it to light it up this way, that it has to be a play of contrast, for instance. You can say “it has to be a brilliant and stunning scheme” because it understands a very conversational language.
‘It literally understands that language, so it will get what you are trying to say and it will light it up accordingly.
‘But what you cannot do is tell it to specifically light up a particular element such as a column with a linear LED – you can’t do that.
‘You can upload an image of a fort or an exterior, but the output won’t be the same image. It will tweak it to suit its own understanding and its own thinking at that particular time.
‘But if you give it more parameters that are close to your uploaded image, it will produce something similar, though not the exact image.’
He believes that AI-assisted lighting design – where you can upload an image and get a render of a lighting scheme – is only a matter of time.

AI-assisted lighting design is a matter of time.
‘The way this is progressing, the way the space is developing, I would say that day is not far off. It’s learning very fast and it’s getting very responsive. I think it’s going to happen very soon.’
And it’s beginning to understand lighting techniques. ‘When I started, it didn’t do that,’ says Izhar.

Midjourney already understands colour temperatures and concepts such as linear lighting.
So the key question is: What will be the practical uses of AI for a lighting designer?
Izhar currently uses it more as an inspirational tool, and is already including AI-generated images in his projects.
‘For instance, I’ll use the images in a storyboard or a mood board as the initial concept. Usually what happens is that we designers tend to grab reference images from Pinterest or from the internet. So this is like your personalised Pinterest.’

A concept for a suspended pendant that Faraz Izhar created using version 4 of Midjourney.

It’s also useful for avoiding copyright issues, says Izhar. If there’s an image that you particularly like and want to use, you can upload it to the bot, tweak it and maybe add extra layers of lighting.
‘Then it will produce an image with a very similar intent. But crucially it won’t be the same image, and there won’t be a copyright issue. So I think it’s quite useful for concept narratives.
‘When AI can produce lighting design renders, I would say that would be the time for us to worry, because clients would be doing everything on their own. They just have to learn how to operate it and how to speak to it, which is not at all difficult.’
Although clients don’t know how to achieve lighting effects using practical luminaires, Izhar expects them to begin to generate ideas on their own [using the AI tool] and then approach designers to say ‘this is the kind of thing that we’re looking for – now we want you to make it practical and constructible’.

Izhar expects clients to begin to generate ideas on their own and ask designers to make them practical.
‘I would say that this could start happening in maybe a year or two. But it's going to happen soon. It's developing very fast at the moment.’
Is there a danger for lighting designers that the creative element will be taken away from them and that they will merely be asked to make a concept happen practically? ‘Yes, that’s right. Just basically serving as glorified draftsmen.
‘The technology is already creating quite a stir in the realm of digital art. Artists are upset because they believe it’s copying their styles and their ideas.
‘For example, if you search for certain illustrators on Google, rather than displaying their original images, Google is pulling images from the Midjourney archives instead – images that copy their style.
‘Their intellectual property has been taken by the computer and not credited.’
He fears the same issue could arise in lighting design. ‘Perhaps you could have one of your projects being used by AI to create an image that somebody else uses, and you don’t get any credit or acknowledgment. It’s doing that already.
‘You can literally tell it to copy a particular style, such as Zaha Hadid’s, for example. It can copy any famous architect. It can copy the styles and create images based on those architects’ existing work around the world.’

If requested, AI can copy a particular style such as that of the architect Zaha Hadid.
He thinks it’s too early now for organisations such as the IALD to formulate a policy on AI but fears that day is fast approaching.
‘But the way this space is developing, I think that kind of scenario would happen, and they would have to think about it in two years, maybe.’ ■
Thomas Paterson
DIRECTOR, LUX POPULI, MEXICO CITY

Paterson studied AI as part of his mechatronics degree at the University of New South Wales in the 1990s. He believes the lack of meaningful data sets will hold back its effectiveness in lighting design.
‘AI is going to have a huge role in small optimisations, in taking the principles of a design and working out the right densities of lights in a system, working out the right timings etcetera.
‘But I think we’re further away than people think, because the things that are dazzling people now, like ChatGPT, are fundamentally working in a space where there is a massive data set which is highly evaluated. You can tell what’s good writing and what’s not by how much it sells, by click counts and so on.
‘But how do you actually evaluate whether a lighting design is good? Otherwise you’re just steering towards an average of what’s been published.
‘In terms of actual design thinking, it’s millions of miles away, because there’s no data set that actually qualifies whether a lighting design served the client’s interests, whether it integrated with the facade or the building systems, or any of those aspects.
‘It’s too complicated, and there are just no data sets for lighting that are meaningful. Who can feed in 100,000 examples of projects with their documentation and with an evaluation of what worked and what didn’t? Those data sets don’t exist, and a lot of that information is confidential.
‘So I think it’s going to be a useful tool. You should be able to identify which walls are wall-washed and with what type of product, and what type of downlights, and so on. And in principle, AI should be able to lay out your plans, do all of your load scheduling, write your spec. There’s a lot that it’s going to be able to do.
‘I’m not particularly excited about it any more than any other tool, but I don’t fear it. At our end of the lighting design market, we’re doing truly crafted, truly detailed lighting design. People come to us because they want us to do innovative thinking. And innovative thinking means understanding stakeholders’ interests, understanding the history of a place, understanding how something is built, understanding how it will be constructed, which are two different things.

The Nos restaurant in Lima, Peru by Lux Populi. The practice worked in close cooperation with Taller MER of Miami on the project.
Photo Credit: Laura Arroyo and Thomas Paterson

The Momentary arts complex in Bentonville, Arkansas by Lux Populi. The architect is Wheeler Kearns Architects.
Photo Credit: Tom Harris.
‘That huge breadth of knowledge, plus the knowledge of psychology for brand and identity and all these sorts of things, is all dependent on much research and knowledge, and we exist because people want that. Otherwise, I’ll go to a sales rep lighting designer. So a lighting designer who is threatened by AI is not a very good lighting designer.’ ■
AI HAS REVOLUTIONISED OUR WORKFLOW
Dan Weissman, senior associate at Lam Partners and director of Lam Labs, predicts that AI will increasingly be able to take on mundane and bureaucratic tasks, freeing up designers to focus on creativity.

Dan Weissman
Using AI to generate visual images can serve a particular purpose. As a tool for idea generation or iteration, it has some interesting possibilities. For instance, I’ll render an image of a design concept in 3ds Max and then upload it to DALL-E, and [the AI] will offer alternative design options. But to me, this isn’t that interesting in the long run.
What interests me more is using AI-based tools embedded in other things. For instance, in the past a render used to take a couple of minutes to render out, and it was very pixellated and pretty grainy. Now we’ve started using an AI DeNoise script with our render engine, iRay. The AI samples all the pixels, guesses at how it’s going to look, and within a second or two the image looks relatively realistic.
The result is that iRay is now exceedingly fast and we’re basically working in real time in digital space. We’re literally moving around a model in real time.
We’re able to work on the thing itself, rather than a proxy. And it’s not just radiosity, it’s all the lighting phenomena.
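The sample-then-denoise idea can be illustrated with a classical stand-in: a plain moving average over a noisy signal. A real AI denoiser like the one Weissman describes is a learned model; everything below, including the function names and the noise figures, is an invented toy, not the actual iRay pipeline.

```python
import random

def moving_average(signal, k=5):
    """Toy stand-in for an AI denoiser: replace each sample with the
    mean of its k-wide neighbourhood. A real denoiser learns this
    mapping from data instead of hard-coding it."""
    half = k // 2
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - half):i + half + 1]
        out.append(sum(window) / len(window))
    return out

def mean_abs_error(xs, target=0.5):
    """Average distance from the 'fully converged' render value."""
    return sum(abs(x - target) for x in xs) / len(xs)

random.seed(0)
clean_value = 0.5                                                  # converged pixel value
noisy = [clean_value + random.gauss(0, 0.2) for _ in range(200)]   # few-sample render noise
denoised = moving_average(noisy)
# mean_abs_error(denoised) comes out well below mean_abs_error(noisy)
```

The speed-up Weissman describes comes from exactly this trade: stop sampling early (cheap, noisy) and let the denoiser estimate what the converged result would look like.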
We’ve been using AI embedded in this programme for four years with significant success, and it’s basically revolutionised the workflow at our office.
I don’t waste time trying to sell a client on an idea with diagrams – that’s silly – I just jump right in and model up some options for them in real time. I’ll pull up the programme, move around in real time, they’ll say ‘I don’t like that’ and we’ll move things around.
So I don’t need AI to generate scenes or concepts. What I want to use it for is to eliminate the most annoying parts of the design process.
There are three main processes [in lighting design]: visualisation, documentation and specification. I see AI being able to participate in various ways in these, but under the surface.
With documentation, AI can tag hundreds of different items on a drawing. I want an AI that will automatically sweep through and tag every single fixture with the appropriate tag, and keep that tag from overlapping the fixture, and make it readable at all scales.
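A minimal sketch of the overlap-avoidance logic such a tagging tool would need, assuming a simple rectangle model of fixtures and tags. The coordinates, offsets and function names (`overlaps`, `place_tags`) are invented for illustration; this is not any real CAD or documentation API.

```python
def overlaps(a, b):
    """Axis-aligned overlap test; rectangles are (x, y, width, height)."""
    return not (a[0] + a[2] <= b[0] or b[0] + b[2] <= a[0] or
                a[1] + a[3] <= b[1] or b[1] + b[3] <= a[1])

def place_tags(fixtures, tag_w=12.0, tag_h=4.0):
    """Greedily place one tag rectangle per fixture, trying offsets to
    the east, west, north and south of it until the tag clears every
    fixture and every previously placed tag."""
    offsets = [(6.0, 0.0), (-6.0 - tag_w, 0.0), (0.0, 6.0), (0.0, -6.0 - tag_h)]
    placed = []
    for fx, fy, fw, fh in fixtures:
        for dx, dy in offsets:
            tag = (fx + dx, fy + dy, tag_w, tag_h)
            if (not any(overlaps(tag, f) for f in fixtures)
                    and not any(overlaps(tag, t) for t in placed)):
                placed.append(tag)
                break
        else:
            # No candidate position fits: fall back to just right of the fixture.
            placed.append((fx + fw + 1.0, fy, tag_w, tag_h))
    return placed
```

A production tool would also have to respect drawing scale and text legibility, as Weissman notes, but the core constraint (no tag over a fixture, no tag over another tag) reduces to rectangle tests like these.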
For me, the promise of AI is taking out all the annoying stuff. I read a great quote online: ‘We don’t need AI to do art, we need it to do all the other shit so human beings can do art’. ■