A Load Take-Down is a procedure frequently performed by structural engineers to assess the amount of loading carried by the columns of a building into its foundations. It is an important early-stage analysis necessary to inform the choice of column layout and foundation system, but it is also a notoriously tedious and time-consuming process that is regarded as something of a ‘rite of passage’ for young engineers to endure.
Typically, the take-down is performed in one of two ways. Either the tributary areas (the region of loading that each column nominally supports) must be calculated manually for each column on each floor and then tallied up (commonly via a spreadsheet), or a full 3D finite element model of the entire building must be constructed and the forces extracted. The latter requires resolution of a level of detail which is often inappropriate during the early phases of a project and the former is both slow and prone to errors. Most importantly, both methods can require significant re-work in order to adapt the analysis to modifications of the geometry and this limits our ability to experiment and respond to design changes.
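As a rough illustration of the manual method, the tributary-area bookkeeping for a building on a simple regular grid can be sketched in a few lines of Python. All grid dimensions, loads and function names below are illustrative only, and real buildings rarely reduce to so clean a case:

```python
# Minimal load take-down sketch for a regular column grid.
# Interior columns support a full bay in each direction;
# edge and corner columns support half bays.

def tributary_area(ix, iy, nx, ny, sx, sy):
    """Tributary area (m2) of the column at grid index (ix, iy)."""
    wx = sx if 0 < ix < nx - 1 else sx / 2  # width supported in x
    wy = sy if 0 < iy < ny - 1 else sy / 2  # width supported in y
    return wx * wy

def take_down(nx, ny, sx, sy, floor_load, n_floors):
    """Tally the load (kN) carried by each column at foundation level."""
    return {
        (ix, iy): tributary_area(ix, iy, nx, ny, sx, sy) * floor_load * n_floors
        for ix in range(nx)
        for iy in range(ny)
    }

# 4 x 3 grid of columns at 6 m centres, 5 kN/m2 per floor, 10 floors
loads = take_down(4, 3, 6.0, 6.0, 5.0, 10)
```

Even in this idealised form, the tedium is apparent: any change to the grid, the loads or the storey count means repeating the tally for every column on every floor.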
RCD’s TADPOLE (TAke-Down Process On Loaded Elements) is an in-house software project that provides a new alternative method, automating and greatly speeding up the analysis. The standalone tool can read in 2D floor plan drawings and assemble them, level by level, into a complete representation of the building. Loading areas and column positions can be automatically interpreted by the tool if present; otherwise, the software provides a full suite of drawing tools to allow the engineer to sketch out loads, columns, walls etc. Once this data has been input, the software automatically determines tributary areas and performs the take-down. Changes to the input data can be made easily and their impacts assessed instantly.
This eliminates the need for tedious manual calculation and, because the application is designed and streamlined for this specific purpose, there is no need for any extraneous data to be input. Because the tool is graphical, odd results and input errors can be spotted and traced far more easily than in a spreadsheet.
To help further manage the data, the results of the analysis can be output to an interactive online dashboard via Power BI, making it easy for the lead engineer and client to interrogate. A full report can also be generated to document the process, results and assumptions. To eliminate re-work, the tool can also assemble the input plans into a full 3D building model that can be exported to Autodesk Robot to form the basis of a more detailed analysis.
This has allowed us to do in hours what would previously have taken days, and in a way that would not have been possible without building the tool ourselves. Commercial software is typically made to be as broad as possible in order to capture a wide user base. This means that it is often poorly optimised for certain tasks. By developing our own tools designed to meet our exact requirements and workflow we can plug these gaps and work more efficiently, enabling us to beat time pressures by responding faster, iterating more often and, ultimately, to produce better, more rigorously-checked designs.
Salamander 3, a new structural modelling and interoperability tool developed by RCD lead Paul Jeffries, is now in open beta and available to download from Food4Rhino. The tool adds the ability to model structural elements such as beams, slabs, nodes etc. inside Rhino and for this data to be exchanged with analysis packages (at present, Autodesk Robot and Oasys GSA).
The tutorial videos below demonstrate how to install the Rhino plugin and utilise some of the basic modelling commands in the tool to develop a simple structure.
A recording of the talk I recently gave as part of the ‘Design Discourse’ series at Imperial is now available on YouTube, here:
Unfortunately many of the animated embedded .gifs in the presentation did not display properly on Imperial’s hardware (computers, eh?), so they have been included below instead – click on each one to view the animation:
Sketchpad – the first graphical CAD tool – in operation.
On Tuesday 16th May Paul Jeffries will be delivering a public lecture at Imperial College London entitled ‘Emergence: The development and future of computational design’. The talk will be held in Room 201 of the Skempton Building and begins at 18:30. All are welcome to attend.
For the 2017 Ramboll Leadership Conference in Copenhagen, which took place on the 22nd and 23rd of January, RCD was involved in a collaboration between the Transport and Buildings departments to design and construct a ‘bridge’ installation between their respective stands. We had a little over a month to develop and manufacture the design so timescales were tight and we had several key criteria to consider – the bridge was to support a model train running between the two stands (in reference to the Holmestrand Mountain Station project), it needed to be light and easily demountable enough for us to carry from London to Copenhagen, build in an afternoon, break down in an hour and then return back to London (for later re-assembly in our home office). We also wanted it to form an interactive part of the conference rather than merely being a static display piece.
We approached the project the same way we would any other – pulling together a team with relevant expertise, brainstorming ideas, analysing and developing them. For the interactive element, we realised that business cards made an ideal impromptu craft material and were one of the few things we could rely on most of the attendees to be bringing with them. The decision was thus made to allow people at the conference to leave their business card, folded into a specific 3D form, as part of the bridge’s cladding.
Design of the overall structure progressed rapidly through several meetings, based around a flexible parametric Grasshopper model developed by RCD that allowed for collaboration around real-time adjustments to the geometry. After examining several options we settled on a timber shell/arch structure as an aesthetically pleasing, lightweight and robust solution that would reference both Ramboll UK’s expertise in timber structures and a previous RCD project, the TRADA pavilion, and which could be rapidly manufactured and assembled.
Throughout the development of the bridge, the geometry was exported to and analysed in MIDAS by the London Bridges team in order to make sure the design was structurally feasible and to guide further refinement of the form and material thicknesses. Additionally, preliminary samples of sections of the bridge were laser cut to allow us to physically examine and test the manufacturing process and connection detail design.
In order to enable the bridge to be rapidly assembled and disassembled we wanted to avoid the use of adhesives or mechanical fixings. The connections were therefore designed as simple slotted plates, located by a matching slot in one of the plates they joined and restrained laterally by small standard ‘U’-shaped clips, all cut from the same 6mm plywood as the rest of the structure. The nature of the shell form meant that the angle between each panel (per quarter of the structure) was different. Generation of these connector pieces was therefore integrated into the Grasshopper model, which determined the cutting pattern for each connector and panel and automatically engraved an identifying number onto the inner side of each piece so that matching pieces could be easily found during construction. Each connector also incorporated a small hole through which the line supporting the bridge deck could be passed.
The slots into which business cards could be placed were likewise incorporated into the Grasshopper model, arranged so as to fit in the maximum number of business cards without compromising the structural integrity of the panels. Due to the variety of panel shapes and sizes, no single placement algorithm was found to give consistently good results, so two separate arrangement algorithms were used to determine slot placement, with the better result automatically selected for each panel to give the final arrangement.
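The ‘run both, keep the better result’ selection can be illustrated with a toy sketch. The two placement strategies below are simplified stand-ins for the actual Grasshopper algorithms, and the card footprint is an assumed value:

```python
# Toy version of selecting the better of two placement strategies per panel.
# Both strategies and the card dimensions are illustrative stand-ins.

CARD_W, CARD_H = 0.09, 0.06  # nominal business card footprint in metres (assumed)

def rows_of_slots(w, h):
    """Strategy A: rows of upright cards across the panel width."""
    return int(w // CARD_W) * int(h // CARD_H)

def columns_of_slots(w, h):
    """Strategy B: columns of rotated cards down the panel height."""
    return int(w // CARD_H) * int(h // CARD_W)

def best_placement(w, h):
    """Run both strategies and keep whichever fits more cards."""
    return max(rows_of_slots(w, h), columns_of_slots(w, h))
```

A wide, short panel favours one orientation and a tall, narrow one the other, which is why no single strategy won on every panel shape.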
Foundation design is a key component of any project and this one was no different. Two pedestals were designed to support the feet of the bridge. As an arch, the natural tendency of the structure under load was to push outwards at its base. To resist these thrusts without having to tie the base of the arch together or carry heavy ballast in our luggage, the pedestals contained hidden compartments concealing bottles of water, which were procured on-site and provided the necessary weight.
This being a conference for engineers in Denmark, it was a foregone conclusion that the train the bridge would carry should be made out of LEGO. The train in question came with a seven-speed remote control; however, to avoid having to manually drive the train for two days straight, it also fell to RCD to automate this by hacking the controller. The rotary dial which controlled the train’s speed produced different signals when turned clockwise or anticlockwise, instructing the train to accelerate or decelerate. By hooking up these contacts to an Arduino Uno board programmed to mimic these impulse patterns it was possible to control the train’s movements programmatically and have it moving backwards and forwards across the bridge without human intervention. Unfortunately several key wires were damaged in transit, requiring some frantic (but ultimately successful) repair work with a borrowed soldering iron the day before the conference.
That mishap aside, the bridge itself made it to Copenhagen without damage and was erected successfully at the conference. It proved very popular with the attendees, becoming packed with business cards by the end of the second day and successfully demonstrating the capabilities of computational design and collaboration to the wider business.
“Our team was made up of people with different skills sets and backgrounds, who were unified by a desire to create something unique. The bridge was a success because all team members contributed their technical expertise, yet listened to and challenged each other to continually improve and refine the design.
This project shows that having the right mix of people with a passion for a common goal can generate great design in a short period of time.”– Sarah Ord, Project Manager
“The Transport and Buildings teams collaborated seamlessly; bringing our respective strengths together created a more complete and superior design.
“The use of parametric modelling and rapid prototyping and manufacture freed up the team’s time to concentrate on the creative design of the bridge through swift iterations. Designing and building the bridge in one month would not have been possible without this approach.”– Ollie Wildman, Director
“I worked on the structural analysis of the bridge, ensuring that the design was robust enough to stand and carry the applied loads. It was great to have worked on such an innovative project and of course it could not have been done without this amazing and passionate team. Overall it was a brilliant experience and I am looking forward to working on similar projects in the future!”– Neophytos Yiannakou, Bridge Engineer
“Parametric modelling enabled quick optimisation and adjustment of the bridge geometry, making it easier to model and analyse. In a short period of time we were ready to print and test a first prototype, which was key to meeting the project deadline.
“It has been a wonderful experience to design and actually build the bridge with such a diverse and motivated team. It is in projects like this where you realise the potential of combining different disciplines.”– Xavier Echegaray Jaile, Bridge Engineer
The complete bridge is now on display in the reception area of Ramboll’s London offices at 240 Blackfriars Road.
From January 2017, Imperial College London will be running an evening course on Parametric Engineering, co-taught by RCD lead Paul Jeffries. The course will cover the application of Rhino and Grasshopper for computational design within an engineering context and is open to anybody in full time education or academic employment. To apply contact Simply Rhino.
If you’ve arrived at this blog, you will probably have had some exposure to the concept of ‘computational design’. You may also have heard some of the related terms that fall under this heading – ‘parametric design’, ‘algorithmic design’, ‘generative design’ and so on. As computational design is still a relatively young and evolving field the meanings of these terms can be a little vague and are used by different practitioners in different ways. This article presents the vision of computational design that we have in Ramboll and the role that we see it having in the future of the industry. This is what *we* mean by computational design.
But, before we can answer the title question we need to first answer another – what is design?
Even within a single discipline, we might divide the process of delivering a project into two – the mental and the physical. In the former category we have the cerebral work that goes into a design – generating ideas, understanding requirements, thinking (and talking) through problems and deciding on the fundamental principles that go into forming ‘the design’. But this cannot stay a purely intangible undertaking – we as designers also need to test our ideas and communicate them to our clients and colleagues, and for this we must engage in a range of more tangible activities – performing calculations, writing documents, producing drawings and models and so on. These are not merely end-products, however – they are integral to producing a better understanding of the problem we are trying to solve and the implications of our assumptions in solving it. There is thus an interplay between the mental and physical sides of design. The process as a whole is highly iterative, with many embryonic design options dreamt up, examined and refined or discarded on the way to the ultimate solution.
Recently, computers have been increasingly used as a method of production, to the extent that the second half of the above equation might often be termed ‘virtual’ rather than ‘physical’. Whereas previously we would have produced drawings by hand, we now more commonly draw on the computer using CAD (Computer-Aided-Design) packages such as AutoCAD and Rhino. Whereas in the past we would have had to physically construct an architectural model to see what a project looked like in 3D, we can now build and view a virtual 3D model, perhaps with additional detailed information embedded into it. Whereas we would have had to perform engineering calculations by hand we now have a plethora of software packages available to perform analysis and run through standard calculations on our behalf.
These are some of the ways in which computers are now used in design, but is this what we mean by computational design?
These technologies augment the process of design to make it more efficient, but they do not represent any fundamental change to the process itself. The first generation of CAD software set out to replicate as closely as possible the previously existing paradigms – they swapped out the mechanical pencil for the mouse and the eraser for the delete key but otherwise the experience was maintained. To draw a line, you press down and move your hand from start to end. This was deliberate and, to an extent, necessary during the first transition into the virtual world, but in treating a computer as merely a replacement for a sheet of paper the true power of computation was overlooked.
Computers are not inanimate objects. They are machines of logic and process. They can think; not quite in the same way we do but in a way which is certainly compatible. That means that they can be integrated not only with the physical aspects of the design process but with the mental ones as well.
A (good) design is a fundamentally logical construct. Every aspect will have some reason to be the way it is, whether that is structural, functional, aesthetic or some combination of the above. Walk into the office tower of your choice, for example, and you are likely to find that the columns which support the building are not arranged randomly – they will be evenly-spaced and follow a regular grid. This is done to make the structure more efficient, easier to build and to allow for standardisation of components. Where columns deviate from this grid there will likewise be good reasons for that to be the case – perhaps to keep an auditorium space column-free, perhaps to allow enough clear space for access to be provided for large vehicles, perhaps to better support large loads from above. Each column will have an underlying logical process determining its placement.
Traditionally, it would be for humans to both decide upon this logic and then work through it to determine the arrangement it suggested, drawing or modelling the result. But this second stage is well within the capabilities of the computer, which is after all nothing more or less than a machine for the evaluation of logical processes. If the human can describe the principles driving the design in a form that the computer can understand – i.e. as an algorithm – then the computer can begin to take on a much larger role in the design process, becoming not just a recipient of data but also a generator of it, creating the design representation from the rules the designer has set. This shift is what marks Computational Design as distinct from simply using computers in a more traditional design exercise.
In brief: Computational Design is a change in the medium of design expression from geometry to logic.
There are a number of advantages to this approach, the first being that the geometry of the design tends to be changed far more often than the logic. As a structural engineer, I may want to try out several different arrangements of the column grid in order to find the frame that best fits the geometry and construction type of the project. I am unlikely, however, to discard the principle of using a regular grid altogether. If changing the grid means having to redraw every single column position, or perhaps even having to fully recreate from scratch whatever analysis model I am using to make my assessment, that is going to limit the number of options I can feasibly examine (and make me far more likely to stick with whatever I first came up with). If changing that grid merely means adjusting a few input parameters of my generative model and having everything else done for me by the computer, then I have far more freedom to explore the design space, converge on a better arrangement and adapt to external changes and new information introduced later in the design process. I can, in short, come up with a better design.
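As a minimal sketch of this idea, the column-grid logic might be captured as a small generative function. All parameter names and values here are illustrative, not drawn from any real project or tool:

```python
# Describing a column layout as logic rather than geometry: the rules
# (bay counts, bay sizes, deliberate exclusions) are the design input,
# and the coordinates are generated from them on demand.

def column_positions(n_bays_x, n_bays_y, bay_x, bay_y, exclusions=()):
    """Generate (x, y) column coordinates from grid rules.

    `exclusions` holds grid indices deliberately left column-free,
    e.g. to open up an auditorium space below.
    """
    return [
        (ix * bay_x, iy * bay_y)
        for ix in range(n_bays_x + 1)
        for iy in range(n_bays_y + 1)
        if (ix, iy) not in exclusions
    ]

# Trying a different grid is one call, not a redraw:
initial = column_positions(5, 3, 7.5, 7.5)
revised = column_positions(6, 3, 6.0, 7.5, exclusions={(2, 1), (3, 1)})
```

The logic (a regular grid with named exceptions) survives every geometric change; only the parameters vary between options.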
Leaving the resolution of the design logic to the computer also removes the restriction that said logic must be resolvable by humans. When rules begin to combine with one another their effects can sometimes be hard for the human brain to visualise. A fractal image, for example, is typically generated by very simple operations repeated over and over and over again, but while the rules may be easy to understand it can be very difficult to anticipate the geometric result without prior experience. So too with buildings, the many competing design drivers of which are often dealt with through simplification and convention far more than they are by optimisation. Computational design allows us to break through these barriers and produce responsive virtual models to do what brainpower alone cannot.
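The fractal example can be made concrete. The classic Koch curve is generated by a single rule – replace each segment with four shorter ones, bending the middle third into a triangular bump – applied over and over, yet the resulting geometry is hard to picture from the rule alone:

```python
# The Koch curve: one simple substitution rule, applied repeatedly.
import math

def koch(p0, p1, depth):
    """Recursively replace a segment with four smaller ones."""
    if depth == 0:
        return [(p0, p1)]
    (x0, y0), (x1, y1) = p0, p1
    dx, dy = (x1 - x0) / 3, (y1 - y0) / 3
    a = (x0 + dx, y0 + dy)          # one third along
    b = (x1 - dx, y1 - dy)          # two thirds along
    # Peak of the equilateral bump raised on the middle third
    mx, my = (a[0] + b[0]) / 2, (a[1] + b[1]) / 2
    peak = (mx - dy * math.sqrt(3) / 2, my + dx * math.sqrt(3) / 2)
    segments = []
    for q0, q1 in ((p0, a), (a, peak), (peak, b), (b, p1)):
        segments += koch(q0, q1, depth - 1)
    return segments

curve = koch((0.0, 0.0), (1.0, 0.0), 4)  # four applications of the rule
```

Four applications of a rule simple enough to state in a sentence already produce 256 segments – far beyond what most of us can hold in our heads, but trivial for the machine to resolve.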
Computational design is an excellent means of dealing with complexity, whether that complexity is caused by the interaction of the factors we have control over or the uncertainty surrounding the factors we don’t. Traditionally this approach has been applied mainly to niche projects whose obvious visual complexity demanded it – buildings with highly sculptural forms, intricate facades and so on that would be next to impossible to design through any other means. However, all projects are complex in their own way, and can benefit from automation to handle that complexity. At Ramboll we recognise this, and so are working to make computational design technology and expertise a more deeply embedded and mainstream part of our design process across all types of project.