Melbourne architecture won’t benefit from group thinking

The City of Melbourne is forming a panel of experts to take on architectural designs, eschewing the traditional ways of visionary designers, writes Elroy Rosenberg.
Melbourne needs to lift its architecture game, or so we’re told. Deputy Lord Mayor Nicholas Reece published an op-ed in The Age which, with great gusto, elucidated the dire architectural situation in which Melbourne finds itself mired. Recent reporting into the swathe of highrise developments dotting either side of the Yarra points squarely at endemic poor design: excessive noise, including groaning and creaking; notoriously combustible cladding; and, in Reece’s words, towers which ‘are nothing more than spreadsheets in the sky’.

So far, so good — until we unearth the piece’s double purpose. Not merely an argument, it also plays press release: the City of Melbourne, Reece reveals, is proposing a new Design Excellence Program. The program, a reinforcement of the Central Melbourne Design Guide approved by the CoM in 2019, will include the formation of two bodies: the Melbourne Design Review Panel (MDRP) and the Design Excellence Advisory Committee (DEAC). The former is a panel of ‘leading’ industry figures that doles out ‘multidisciplinary advice’ on proposed projects; the latter is instead a ‘platform for industry, academia, and community to engage in a range of topics’.
The idea isn’t quite so novel as one might think. There is already a Victorian Design Review Panel (VDRP), established by the Office of the Victorian Government Architect (OVGA). It lists 86 architects, urban planners, designers and more as members on its panel. (The Melbourne panel would initially consist of up to 12 members.) The VDRP began as a three-year trial scheme and was evaluated in 2013 by SGS, an urban public policy consultancy firm.

I asked Andrew McDougall, who worked on the VDRP evaluation and is executive director at SGS, how his team of five undertook their almost six months of evaluation. The process, he assured me, was entirely independent, although the VDRP, as SGS’s client, was allowed to read, continuously comment on and provide context for the evaluation as it developed. SGS was set a series of questions by the VDRP relating to how “efficient and effective” the panel was, and how well it “delivered on its objectives.” This was done in part by interviewing all panellists individually, as well as by undertaking specific case studies. In the end, SGS deemed the program “effective.”