Cover story: BIM is bust
How should AEC data work?
If all BIM software were developed fresh today with modern computer science, we would not face the limitations of the tools and formats we have inherited. Greg Schleusner, director of design technology at HOK, talked with Martyn Day about how we could free data from monolithic BIM silos
At NXT BLD 2021, we invited Greg Schleusner, director of design technology at global design, architecture, engineering and planning firm HOK, to come over from New York and give a presentation on some of his R&D projects (nxtbld.com/videos/greg-schleusner). Schleusner has been exploring how the digital tools we use today act as barriers to collaborative design, and pondering whether something can be done about it.
In his insightful talk he looked at inefficiencies in creating 2D representations, problems with annotation, the massive amount of data duplication, lack of knowledge capture and reuse, and the lack of openness.
If you have not seen the presentation, it’s probably a good idea to watch this now, as the rest of this article is a continuation of that thought process, based on Schleusner’s subsequent research.
When most people talk about their frustration with BIM tools, it usually concerns speed, the need for workarounds and the cost of ownership. But if we are to seriously address the problems the industry faces today, we have to take a lower-level approach. We should not just expect faster evolution of tools used commonly today but question the very data structures on which they are built. We have plenty of formats; we also have plenty of incompatibility, data loss, data overshare, data wrangling and data in proprietary silos that inhibit collaboration.
“Our data functionally doesn’t work with our process. It’s all built for throwing over a fence in a very linear, transactional process,” explains Schleusner. “But, in reality, in design and construction you have this overlap, where construction planning needs to start during the design phase and, yes, unfortunately the design is going to change.
“These iterative changes set off a series of continuous updates to files. In the way our world works, every one of these updates is a copy and, every time something changes, we potentially end up having to redo our work downstream because you’re not in control of that data.
“Our solution for this technology problem is legal agreements. This is a counterproductive way to think about how we can solve our issues, literally hoping to change the legal system. It leads to discussions about better contracts, and the hope that high-level trust agreements can be signed, where you let other firms access and edit your data, which only works in a very small number of cases.”
Data distribution
Schleusner sees all kinds of problems with the technology solutions we have today for collaborative working, defining rules on partitioning as to who can do what and who can change what. Today’s platforms can’t properly audit changes; there’s no global track history of who did what. “This approach is a poor match for technology, and we should never have to rely on these sorts of arcane overlays of ‘let’s be friends’ to get a project to work,” explains Schleusner. “There’s this fundamental problem where our data needs to become distributed but not particularly owned. The way it works today just enforces an inefficient linear process. I’m wondering if we can get the software to work independently of the way people work.
“What we really want is to have a feedback loop, which we can’t do at all today, because of all sorts of limitations - the very nature of files, proprietary formats, incompatible applications, deeply siloed data, poor interoperability… the list goes on. We have to think differently.
“What’s happening to our world is that it’s becoming unbelievably highly packaged, where everything is highly recursively related: as the project progresses, each discipline requests changes to the design through the phases. This process is accelerating, but our data structures and everything else are structured to support a linear process. There’s still no easy way for a contractor to suggest a change that’s easily adopted by a design team.
“Changes come in many scales. It could be a material choice or it could be as complex as redesigning a whole roof structure or a façade system.
“If you look at something as simple as a
HOK’s Circadian Curtain Wall concept draws on biophilic design to offer building occupants abundant natural light while minimising solar heat gain Images courtesy of HOK
Decision making latency - data silos within the AEC industry
door, in our current design process even that has a sort of recursive nature. You place a door; you have an external driver - the client has a preference for frame styles or particular hardware. Then the building code or egress needs drive other functionality requirements on that door, which might mean updating the door’s capabilities, and other design issues might feed in which mean you have to go back and change the style. The point being: this is a nonlinear, recursive process.
“Now add construction into the mix and you just get more of the same. We need to think about how to handle the fact that this is a distributed problem; no single entity owned that door completely through the project. In fact, it has five pieces of data which have been owned by different groups. We have this distributed responsibility problem which we haven’t really addressed.
“My best example is architectural finishes, products and materials. There are at least thirteen different places they could show up in project data, with unique representations and/or owners. In the US, the specification will describe a finish’s quality, its testing criteria, and so forth. It shows up in the product list, in costs, in what it looks like in a rendering, in its environmental impact, etc. Just this one material has so many representations. Then you get a problem where the way we handle this is to use humans as the interoperability layer. Every representation of that data exists in some application-specific format that then has to be interpreted and recreated by a person. The chance of errors multiplies with the complexity of the project.”
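The pattern Schleusner describes - one shared identity with many linked, separately owned representations, instead of humans re-keying the same material into thirteen places - can be sketched in a few lines. This is a minimal, hypothetical Python model; all class and field names here are invented for illustration and are not from any actual implementation:

```python
# Illustrative sketch: one material identity with many linked representations,
# each maintained by a different party, instead of being recreated per application.
from dataclasses import dataclass, field

@dataclass
class Representation:
    domain: str        # e.g. "specification", "cost", "rendering", "lca"
    owner: str         # which party maintains this view
    data: dict = field(default_factory=dict)

@dataclass
class MaterialNode:
    material_id: str   # single shared identity across all views
    representations: dict = field(default_factory=dict)

    def attach(self, rep: Representation) -> None:
        # each discipline hangs its own view off the shared identity
        self.representations[rep.domain] = rep

    def view(self, domain: str) -> dict:
        return self.representations[domain].data

oak = MaterialNode("mat-oak-veneer-001")
oak.attach(Representation("specification", "architect",
                          {"finish": "Class A", "test": "ASTM E84"}))
oak.attach(Representation("cost", "contractor", {"unit_cost_usd": 42.0}))
oak.attach(Representation("lca", "material specialist", {"gwp_kgco2e_m2": 3.1}))

# Each party reads and writes its own view; the identity is never recreated.
print(oak.view("cost")["unit_cost_usd"])  # 42.0
```

The point of the sketch is that the specification, cost and LCA views reference one node rather than three manually synchronised copies, so a change in one place cannot silently diverge from the others.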
External framework
Schleusner thinks there are many things that should be part and parcel of how data works: distributed decision making, linked datasets, information distribution, change control and versioning. Today’s software creates ownership through representations. For example, a Revit version of a slab is owned by the architect, but slabs are also owned by other disciplines.
“There is a need to separate ownership and existence out from these representations,” explains Schleusner. “Design systems need a concept that sits external to all this, a root object slab, without the geometry. It’s not any particular discipline’s representation; it’s just a node telling the world that if you want to describe me further, this is what you attach to.
“You may want to imbue it with spatial addressing, to automatically associate representations from two separate software tools, but then you can hang all these interdisciplinary, internal and external pieces of data on top of it. Even for something like a slab, a material specialist might own the LCA [life cycle assessment] information about associated materials, while the architect might be responsible for life safety attributes beyond the geometric representations.
“As you build this out, there’s at least three or four structural or design representations, classifications, building codes and all the good stuff.
“What’s nice about this approach to data
All the data, decisions and processes attached to the lowly door throughout design and construction
is, if you externalise the concept, any one of these representations can be calculated from the graph [model]. A clash-filtered variant could show which kinds of clashes per classification can be run. Fabrication geometry could be another representation. With this tree/graph structure, the other benefit is that you don’t have to keep it all up to date all the time. Let’s make our data capable of working independently of ‘a product’.
“There are advantages to working at a file-per-object level,” explains Schleusner. “This way, you can assemble the required representation, such as requesting to see the structural model, or run a query like ‘what elements changed this week?’ It’s representation on demand.
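The ‘representation on demand’ idea can be illustrated with a toy file-per-object store. The JSON layout, file naming and field names below are invented for the sketch, not taken from Schleusner’s actual system:

```python
# Toy file-per-object store: one small file per element, so representations
# ("the structural model", "what changed this week") are assembled by query.
import json
import pathlib
import tempfile
import time

store = pathlib.Path(tempfile.mkdtemp())

def write_object(obj_id: str, payload: dict) -> None:
    # One file per object: granular, independently versionable updates.
    payload["modified"] = time.time()
    (store / f"{obj_id}.json").write_text(json.dumps(payload))

def query(predicate) -> list:
    # A representation is just the subset of objects matching a predicate.
    objects = [json.loads(p.read_text()) for p in store.glob("*.json")]
    return [o for o in objects if predicate(o)]

write_object("slab-01", {"id": "slab-01", "discipline": "structural"})
write_object("door-07", {"id": "door-07", "discipline": "architectural"})

week_ago = time.time() - 7 * 24 * 3600
structural = query(lambda o: o["discipline"] == "structural")
changed_this_week = query(lambda o: o["modified"] > week_ago)
print(len(structural), len(changed_this_week))  # 1 2
```

A real system would index rather than scan every file, but the shape of the idea survives: the “structural model” is not a monolithic file anyone owns, just one possible view assembled from granular objects.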
“The really important thing is, if we were to output a single IFC file, it’s just too latent to be useful. If we go granular and move to a partial IFC file per representation, we can do close to real-time exporting of just the changes, not full monolithic files.”
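Streaming “just the changes, not full monolithic files” amounts to diffing two model snapshots and transmitting only the delta. A simplified sketch, with plain dicts standing in for IFC element data (the element IDs and attributes are invented):

```python
# Minimal change-only export: compare two model snapshots and emit only the
# added, updated and removed elements, rather than re-exporting everything.
def diff(old: dict, new: dict) -> dict:
    added = {k: v for k, v in new.items() if k not in old}
    updated = {k: v for k, v in new.items() if k in old and old[k] != v}
    removed = [k for k in old if k not in new]
    return {"added": added, "updated": updated, "removed": removed}

v1 = {"door-07": {"x": 0}, "slab-01": {"thk": 200}}
v2 = {"door-07": {"x": 450}, "slab-01": {"thk": 200}, "wall-03": {"h": 3000}}

delta = diff(v1, v2)
# Only the moved door and the new wall travel over the wire; the unchanged
# slab is never re-exported.
```

A receiving application applies the delta to its local copy, which is why the update can approach real time: the payload scales with the size of the change, not the size of the project.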
Looking at a demonstration, Schleusner showed me a Revit user in an office using a plan view to move furniture around while, at home, another session of Revit updated in real time as it received a stream of partial IFC updates. This is a proof of concept, and it doesn’t have to be the same BIM system, or discipline.
The original concepts for BIM were all about the single building model - an idea that seemed to make sense but in retrospect was flawed, because it meant shoving all data, from all disciplines, into one file.
These files were proprietary and got very large, very fast and the systems were not even originally designed for teamwork. All of today’s BIM systems are designed to ultimately produce drawings and, as a result, the industry has managed to drown itself in those.
Near-future requirements of the industry go significantly beyond this, and links between BIM and digital fabrication are embryonic - another silo problem to throw into the mix.
The AEC industry is unfortunately too broad to have ‘one format to rule them all’. Externalising the data framework from today’s monolithic tools can lift data out of the proprietary silos and connect disparate applications via a distributed, multi-part, multi-graph solution.
In his research, Schleusner is not looking to completely reinvent the wheel; he is seeking to embrace open formats and services: IFC, IFC.JS, USD, MaterialX, gbXML and Speckle, to name but a few. Each has its strengths and weaknesses, but they offer ways to extract information now from today’s proprietary tools.
Looking at Speckle and IFC.JS, it’s now possible to have ‘mini-servers’ as plugins, broadcasting every design change
By taking a granular by object approach to data, Epic Games can get incredible performance on massive datasets in Unreal Engine 5, as demonstrated in Matrix city
from each application to a Common Data Environment (CDE) or straight to another enabled application.
For instance, IFC.JS can ‘live broadcast’ BIM geometry changes from Revit to Archicad, streaming IFC component information and updating the models in real-time. Schleusner is looking to combine this with an external multi-representational granular graph representation to disrupt the current limitations on collaborative working by having a shared representation, which doesn’t require editing each other’s models or drawings.
He envisages the system will also track all changes across a project at a granular level. And, once you get all that data out of these proprietary systems and hold it externally, there is potential to do all sorts of interesting things.
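Granular, project-wide change tracking could be as simple as an append-only transaction log over the externalised data. A hypothetical sketch (class and field names invented) of the kind of global audit trail that today’s platforms lack:

```python
# Append-only transaction log: every granular change records who changed
# which element, in global order, giving a full audit trail per element.
from dataclasses import dataclass

@dataclass(frozen=True)
class Transaction:
    seq: int          # global ordering across the whole project
    author: str       # which party made the change
    element_id: str   # the granular object affected
    change: dict      # just the attributes that changed

log = []  # append-only; entries are never edited or deleted

def commit(author: str, element_id: str, change: dict) -> Transaction:
    tx = Transaction(len(log), author, element_id, change)
    log.append(tx)
    return tx

commit("architect", "door-07", {"style": "flush"})
commit("contractor", "door-07", {"hardware": "panic bar"})

# Full audit for one element: who touched door-07, in order.
history = [(t.seq, t.author) for t in log if t.element_id == "door-07"]
```

Because the log lives outside any one authoring tool, the “who did what” history spans disciplines and applications, which is exactly what per-file, per-vendor change tracking cannot provide.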
Development
HOK is pushing ahead with fleshing out and developing Schleusner’s in-house concept, and the global firm is connecting with developers of open solutions who are interested in addressing this fundamental issue, or from whom it can learn. For instance, this granular, by-object approach to data is how firms like Epic Games can get incredible performance on massive datasets in Unreal Engine 5, as seen in the recently released Matrix city (tinyurl.com/MatrixCity).
For now, Schleusner is building proofs of concept while fleshing out the functionality of the external system. There are immediate issues he is looking to address in HOK’s own internal workflows with Revit and other applications, which will serve as the proving ground. Schleusner says, “The goal then will be to start a global discussion with firms that are interested in developing this further.”
Conclusion
As an industry, we are in interesting times. While the construction industry rushes to digitise its entire end-to-end processes, the tools on which this is being built are already creaking, founded on file-based data concepts that create silos.
Cloud-based systems are, so far, just enhancements to the current document-based workflows, which are one of the fundamental reasons the industry is so inefficient. For now, ensuring a worker on site gets the right PDF counts as progress, but everything up until that point does not flow smoothly.
CDEs are a necessity because we live in a land of Babel. Models need to be broken down because we are trying to stuff too much data into archaic schemas. The contractual and legal issues strangle collaboration and, even then, there is little trust between parties, causing rework and the creation of multiple BIM models.
With products like Revit now 20 years old and no second-generation plans announced, it’s easy for the industry to fixate on finding the ‘next Revit’ or to evaluate other BIM solutions. Here, Schleusner’s opinion is that we have to address the problems at a much lower level than the authoring tool, and establish data structures that fit the way the business works, bypassing the productivity-killing problems that the current generation of tools has bequeathed us.
The beauty of his approach is that there are early wins with the current generation of tools. It will help even those using just one in-house BIM system, make integrating tools from different vendors less of a headache, enable parallel development without the project-file merry-go-round, and let all project teams work together to flesh out the model definition, with a full transaction log.
The industry needs to move to an open approach to data as soon as possible. Ultimately, an external database using open formats will help break the bonds of deeply resented proprietary lock-ins, draining the silos and levelling the playing field. While HOK is developing this to smooth its own data flow issues, it’s great that the firm recognises that the industry at large could benefit, and is pushing for practice-wide collaboration on its development.
AEC data structures: how data can be linked to a slab node
The distributed definition for a slab
Check out Greg Schleusner’s NXT BLD 2021 talk - nxtbld.com/videos/greg-schleusner