Four phases of industrial revolution – Phase three

https://www.thenbs.com/knowledge/four-phases-of-industrial-revolution-phase-three

By Jess Sharman | 10 January 2018

According to The Future of Jobs 2016 report by the World Economic Forum, we are in the midst of a fourth phase of industrial revolution, with technology rapidly developing in areas like 3D printing, robotics and artificial intelligence. In this four part series, we’ll explore the history of the industrial revolution so far and take a look at how what’s being developed now will change the way we work and live.

Phase three: Going digital

History of the computer

In 1837, English mathematician Charles Babbage designed the Analytical Engine, the first general-purpose mechanical computer. While opinions vary, most consider Babbage’s design to be the foundation on which modern computing was built. Using punched cards to deliver information, the Analytical Engine was intended to perform arithmetical calculations. Unfortunately, funding problems prevented Babbage from building his machine; his son Henry later completed a portion of it, and Ada Lovelace, a British mathematician, wrote what is widely regarded as its first program.

The first binary programmable computer, the Z1, was built between 1936 and 1938 by the German engineer Konrad Zuse. Zuse went on to build the electromechanical Z3 (1941), which he used to perform structural calculations and analysis in his work as a civil engineer.

First generation: Vacuum tubes and single task processing

From 1937 to 1946, the first generation of computers was built and used. These computers were large and heavy, taking up whole rooms. They used vacuum tubes for processing and could only perform a single task. Key computers of this era were the digital ABC (Atanasoff-Berry Computer), developed from 1937 to 1942, and the electronic programmable Colossus, built in 1943. The ABC was developed in America by a professor and his student; it used over 300 vacuum tubes and wasn’t programmable. The Colossus was of British design, created to help code breakers decipher encrypted German messages during the war.

The first general purpose computer was the Electronic Numerical Integrator and Computer (ENIAC), which was built from 1943–1946 at the University of Pennsylvania. The ENIAC took up 1,800 square feet of space, used 18,000 vacuum tubes and weighed approximately 50 tons.

There is some controversy around which of the two American designs was the first electronic digital computer. In 1973, a US federal judge ruled in favour of the ABC; however, many still consider the ENIAC to be the first because it was fully functional.

Second generation: Transistors and programming languages

The British-made EDSAC and Manchester Mark 1 were two of the first computers that could store programs. UNIVAC I, the first commercial computer built in America, was introduced in 1951; it was not only capable of storing a program, but could then run it from memory. A year earlier, in 1950, Konrad Zuse’s Z4 had been sold, making it arguably the first commercial computer.

In 1953, IBM (International Business Machines) made its debut in the market with the 701, the first commercial scientific computer. In 1954, the first transistor computer (TRADIC) was built by engineers with support from the military. (Transistors had been introduced in 1947, originally replacing vacuum tubes in the telephone system. While transistors were smaller, faster and more efficient, they were also considerably more expensive.) TRADIC used 800 point-contact transistors and 10,000 germanium crystal rectifiers. It was approximately three cubic feet in size, compared with the 1,800-square-foot ENIAC.

The first desktop computer was the Programma 101, which made its debut at the New York World’s Fair in 1964. Invented by Pier Giorgio Perotto, approximately 44,000 Programma 101s were sold at roughly £2,500 each.

During the 1950s and 1960s, over 100 programming languages were developed, and computers gained memory and operating systems. Tape and disk drives allowed for portable storage, and printers provided a means of output.

Third generation: Integrated circuits, shrinking form factor, personal use

The next wave of computers came about due to the invention of the integrated circuit, not only making computers more powerful and more reliable, but allowing for a much smaller form factor. These computers could multi-task, running several programs at a time.

In 1980, MS-DOS (the Microsoft Disk Operating System) was introduced, and IBM launched its first personal computer the following year. Apple followed with the Macintosh and its now-standard icon-driven interface. While IBM PCs and Macs were available for home and office use, their cost was prohibitive for much of the population. This changed with Sir Clive Sinclair’s release of the ZX80, the first home computer marketed for under £100. His later release, the ZX Spectrum, became the UK’s fastest-selling computer.

In the 1990s, we were introduced to the Windows operating system and the World Wide Web. During this decade, the internet became a normal part of working life and, by the end of it, a normal part of home life for many.

During the 2000s, digital made its way to the developing world, and television signals moved from analogue to digital. By 2010, over 70% of the world had mobile phones, and cloud computing allowed these handheld computers to not only meet but exceed PC capability.

Computers in construction

Computer-aided design

CAD technology can be traced back to two major events: Dr Patrick Hanratty’s development of PRONTO, the first commercial numerical-control programming system, in 1957, and Ivan Sutherland’s creation of SKETCHPAD at MIT, a man-machine “graphical communication system” that demonstrated the feasibility of computer-based technical drawing.

In the 1960s, CAD applications were focused on the aircraft industry; by the 1980s, however, they were being used to improve the architectural drawing and design process.

The first CAD systems were basically digital drawing boards. Over time, the technology moved from providing basic 2D representations to 3D models, with modern CAD systems being able to create, simulate, and provide analysis for a wide range of construction scenarios throughout the lifecycle of a building – from initial design to maintenance and refurbishment.

Today, CAD (CADD, CAE, CAM) systems are well-used and commonplace. Where once a CAD system required a specialised and expensive UNIX-based workstation, now all you need is a PC or Mac that meets the software’s system requirements. There is even an online app called Tinkercad that allows you to create and print 3D models.

As for its future, there are several factors driving CAD development. These include:

  • Customisable software: Instead of complex and bloated programs filled with features most of us will never use and lacking those that we need, drafting and design software will be delivered on a platform that caters to simplicity at its base level and allows users to easily find and test bolt-on and third-party applications to determine what works best for them.
  • Specialised offerings: In tandem with general purpose applications that can be customised, CAD will also continue to be offered as specialised versions for use in specific fields.
  • Subscription model: This goes together with the above. The subscription model will allow for more flexibility in what’s offered, and vendors will be able to better cater to what an organisation and/or user needs. It also facilitates continuous adoption of future technologies, with a user’s tools advancing as they do.
  • Cloud-based services: This is how CAD systems will help facilitate the industry-wide move towards better collaboration. By moving to a cloud-based system, many of the challenges that currently face multi-faceted teams when trying to work with a CAD model will no longer exist. Collaboration will be streamlined. Tools will be connected rather than running in parallel. File formats and software versions won’t be an issue. Users will be able to work simultaneously on the same model and view the history of that model and how changes made impact the end product.
  • Standardised format: We’re seeing this now, and we will continue to see a movement to a single, standardised format across applications. Most likely, this will be an XML-based format.
  • Rich in data: Future CAD will provide a wealth of data that will be readily accessible to those who require it whilst remaining proprietary and protected where/when needed.

Geographical information systems

Geographical information systems (GIS) are used to capture, map, manage, visualise and interrogate data as it relates to a location in time and space. Applications include construction scheduling, facilities and service management, property analysis, demographic analysis, and environmental impact assessment. In the design phase of a project, GIS can provide information about geography, potential impacts, and emergency planning. During use, GIS can aid with logistics management and building/site performance. The key to successful GIS is, of course, data and process standardisation, so that information from multiple sources can be combined.

BIM

Building Information Modelling (BIM) brings together individual objects to create a single model that presents all of the inherent properties of a building, and stores them in a database. Most current BIM models are 3D with orthographic 2D plans, but eventually BIM will provide more than that – offering 4D, 5D, and 6D information (more on this later).

The history of BIM is a rich one, with innovators from all over the world contributing to the concept over the past 50 years.

Conceptual origins

The conceptual origins of building information modelling can be traced back to as early as 1962 and a report entitled “Augmenting Human Intellect: A Conceptual Framework” by Douglas Engelbart. In it, the stage is set for the future of architectural design:

“Now he begins to enter detailed information about the interior. Here the capability of the clerk to show him any view he wants to examine (a slice of the interior, or how the structure would look from the roadway above) is important. He enters particular fixture designs, and examines them in a particular room. He checks to make sure that sun glare from the windows will not blind a driver on the roadway, and the “clerk” computes the information that one window will reflect strongly onto the roadway between 6 and 6:30 on midsummer mornings.”

It was during this same time period that the technological equivalent of GIS was being developed. Christopher Alexander, one of the GIS design researchers, promoted the idea of object-oriented programming; however, he realised the idea would have to wait until the needed technology caught up.

In 1975, Charles Eastman published a paper entitled “The Use of Computers Instead of Drawings in Building Design”. In it, Eastman describes a system that is quite similar to what we now know as BIM. He talks about parametric design and 3D representations of objects which could be joined together to form a model that could then be viewed from varying angles. He also suggests that there would need to be a massive database created in order to hold everything. He calls this system the “Building Description System”.

Early development

In the 1980s, several BIM-style systems were being developed, including RUCAPS (Really Universal Computer-Aided Production System), which was used during the Heathrow Airport Terminal 3 renovation. RUCAPS is considered to be the forerunner to current BIM software.

Interestingly, one of the major influences on how BIM developed came as the result of smuggling. In 1982, Hungarian Gábor Bojár was illegally smuggling Apple computers through the Iron Curtain in order to work on a software program that would eventually become ArchiCAD, the first BIM software available for a personal computer. At its original launch, the software was known as Graphisoft’s Radar CH (1984) and was designed for the Apple Lisa, but it was rebranded and launched as ArchiCAD in 1987.

Other notable dates in BIM history include:

  • 1985, Diehl Graphsoft introduces Vectorworks, one of the first cross-platform CAD applications to introduce BIM capabilities.
  • 1988, Parametric Technology Corporation releases Pro/ENGINEER, the first marketed parametric modelling design software in BIM history.
  • 1993, Lawrence Berkeley National Lab develops Building Design Advisor, which uses a model to perform simulations and suggest solutions.
  • 1995, the Industry Foundation Classes (IFC) file format enables data flow across platforms.
  • 1997, ArchiCAD 5.1 is released and includes the first file exchange based Teamwork solution, enabling more people to work on the same model at the same time.
  • 1999, Japanese firm Onuma uses the internet to allow virtual teams to be formed and work together on a database driven BIM planning system, thus paving the way for seamless cross-platform integration of parametric and BIM technologies.
  • 2000, Charles River Software gives us Revit, a revolutionary program that uses object-oriented programming to create a parametric change engine, allowing a time attribute to be added. Because it keeps elements synchronised when any one of them is adjusted, Revit has become the program of choice for BIM.
  • 2001, NavisWorks markets a 3D design review application called JetStream.
  • 2002, Autodesk acquires Revit and, in 2007, they acquire NavisWorks.
  • 2004, Revit 6 is released, which enables larger teams to collaborate and work in a single, integrated model software environment.
  • 2012, Autodesk develops Formit, an application that brings BIM capability to mobile devices.

BIM levels

Level 0 – This is the absence of BIM, where 2D CAD draughting is used to produce output via paper and digital prints and there is little to no collaboration. Most of the construction industry is well past this stage of development.

Level 1 – At this level, groups are using a mixture of 3D and 2D CAD. Data is shared electronically through a common data environment, but models are not shared between project team members.

Level 2 – Information is shared between all parties in a collaborative effort. 3D CAD models are used but there may or may not be a single, shared model. A common file format is used for sharing design information, usually either IFC or COBie (Construction Operations Building Information Exchange).

Level 3 – At this level, known as Open BIM, users work with a single, shared model hosted in a centralised repository. The benefit of Level 3 is a lowered risk of conflicting information; the challenges include how best to address copyright and liability issues.

BIM dimensions

3D – Graphical and non-graphical information shared in a common data environment.

4D – Scheduling information is added to the project information model.

5D – The building information model also includes accurate cost information.

6D – At this level, the wealth of information within the model is such that it provides support during the facilities management and operational stage of the asset. Ideally, the 6D BIM continues to grow and develop throughout the asset’s lifecycle.

While BIM technology is still in its early days, adoption is growing. Slowly, BIM is replacing CAD as the preferred medium for AEC projects. Of course, there are still holdouts, with some in the industry unwilling or unable to adapt to this new way of working. To overcome those obstacles, EvolveLAB’s chief technology officer and director of BIM services Bill Allen asserts that there needs to be a step-change in:

  1. How we process information. Many professionals are still unsuccessfully trying to combine old methods with new technologies, which results in “data waste”.
  2. How our companies create their policies. According to Allen, one of the biggest obstacles is the legal department and their reluctance to keep up with technology. Successful BIM requires information sharing, which the legal department often blocks. Allen suggests that an electronic release form could aid in getting past that roadblock.
  3. How our people view the AEC BIM culture and their willingness to embrace change. While technology is at its heart, the real success of BIM begins offline, with the people.

“If we can learn to put our egos aside, band together as teams, and work with instead of against each other, that would be very helpful,” he says. Simple social interactions—conversations about family and football, for instance—are a good place to start: “If you can connect with them on an emotional level, people are more willing to help you out.”

The future of BIM

In his article “The Future of BIM Will Not Be BIM and It’s Coming Faster Than You Think”, Allen suggests that the future of BIM will “not be BIM”, but rather Building Information Optimization.

“Rather than manually drawing walls, doors, and columns for what we think is a good design, we will feed the computer “rules” instructing it to give us a building’s optimal footprint, structural load capacity, and thermal performance. Things that took months will be done in a day.”

Allen asserts that “BIM 2.0” will be generative design, with design optimised via automation and computers able to scrutinise thousands of potential design variations, shortlisting those that best meet project needs. According to Allen, “Computers are going to be able to design things so much more efficiently than humans can.” (3 Keys That Will Unlock the Future of BIM in Buildings)

This concept ties in with the assertion that we are now entering a fourth phase of industrial revolution – one where virtual/augmented reality, 3D printing and robots play a key role. But more on that next time.
