A simple definition
The finite element method is a numerical tool which allows engineers to bypass the complex mathematics associated with traditional analyses of components and structures and to concentrate on the engineering of the design: to investigate behaviour, to optimize, to examine alternatives. Furthermore, the gross assumptions inherent in traditional engineering calculations can be avoided where necessary, and realistic solutions can be achieved in timescales which cannot be matched by alternative approaches.
This engineering description of the finite element method outlines the essential features of finite element analysis and also provides some of the reasons for the present popularity of this extremely versatile and powerful tool. Engineers can use this computer-based tool to model and simulate components and systems subjected to a variety of loads and environments, and to ask 'what if?' questions.
The finite element method is a numerical method for solving a system of governing equations over the domain of continuous and discontinuous physical systems.
An engineer will perhaps find this mathematician's definition less inspiring, but it is enlightening nonetheless in terms of understanding exactly what finite element analysis is.
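For the common case of a linear static structural analysis, the definition can be illustrated with a standard textbook statement (included here only as an example, not drawn from the original text): the governing equilibrium equations hold over the domain, the unknown displacement field is approximated by piecewise shape functions, and the outcome is a system of algebraic equations to be solved numerically.

```latex
% Illustrative textbook form for the linear static case (an assumed example):
% equilibrium over the domain \Omega, discretisation of the displacement field
% with shape functions N_i, and the resulting algebraic system.
\nabla \cdot \boldsymbol{\sigma} + \mathbf{b} = \mathbf{0} \quad \text{in } \Omega,
\qquad
\mathbf{u}(\mathbf{x}) \approx \sum_{i=1}^{n} N_i(\mathbf{x})\, \mathbf{u}_i,
\qquad
\mathbf{K}\, \mathbf{u} = \mathbf{f}
```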
The essential characteristic of finite element analysis for companies today is that the availability of a commercial general purpose finite element system, which may in itself represent man-centuries of development effort, has the potential to dramatically improve in-house engineering design and development. There must, of course, be a commitment by senior management to ensure that the implementation of such a system is supported in the widest sense. Today's software still requires an investment in training if a company is to realise the full benefits that such an approach can bring. Finite element analysis has also spread from its traditional base in the so-called 'right first time' industries, and its role has widened from its traditional diagnostic base to embrace concept design and evaluation. It is now well and truly a tool utilized in mainstream design. Detailed information on trends relating to this technology is provided at this link.
A modern finite element system is rather like a toolbox. Contained in the toolbox are a wide variety of building blocks to facilitate the study of a wide range of problems. The skill of the engineer is in the selection of the correct tool for the job and in the use of the various tools. The practising engineer will, I am sure, recognise the essential additions to the basic toolbox, illustrated in this figure.
Outline of the analysis process
Almost all finite element analyses follow the steps shown in the figure below.
The formulation of the computer model, from an understanding of the requirements of the analysis, requires good engineering skill and judgement if adequate results are to be obtained. This stage generally involves significant approximations and assumptions. The goal of many finite element analyses is adequate results at reasonable cost, within budget and timescale. The formulation of the problem, as it is traditionally called, is critical to the whole procedure. New generations of finite element systems will certainly widen the scope of analysis types, will make the creation of the computer model easier and will, it is hoped, be able to put an accuracy figure on the results.

It is an unfortunate fact of life that the computer does not provide an indication of how close the approximate answers obtained are to reality itself. Any accuracy estimates available will invariably only provide an indication of convergence for the idealised model. Numerous accuracy estimates have been developed over the years, which can be used to give some indication as to the possible occurrence of numerical error. These are useful, but they do not provide estimates in relation to reality, only with regard to the computer idealisation itself. It must also be realised that the approximations introduced in creating the computer model in the first instance cannot in general be quantified, except through recourse to physical experimentation. If an engineer imposes the incorrect boundary conditions or selects the incorrect type of building block, then all that can be assured is that we have the correct answer to the wrong problem!
A more detailed outline of the analysis process is shown in the following figure.
The outline assumes that the finite element model is based on a geometry master model. However, some finite element analyses are carried out using a so-called bottom-up approach, rather than the top-down method, which uses a geometry master model as the basis for the construction of the finite element model.
With the bottom-up approach, a separate geometry model is not created and the finite element model is built from scratch with finite element building blocks. This approach is becoming increasingly uncommon.
The analysis requirements will dictate, to a large extent, the type of elements or building blocks used in the finite element model and the level of detail necessary to produce the required results. The engineering assumptions which the analyst is prepared to accept will also have a significant bearing on the finite element model. At the outset of the modelling process, decisions will have to be made as to how much of the real world has to be included in the model and what level of detail is necessary. These decisions, along with other engineering assumptions such as plane stress, plane strain, linear elasticity, small displacements, axisymmetry and so on, will indicate the dimensionality of the necessary elements or building blocks, e.g. one-dimensional beams, two-dimensional membranes, three-dimensional shells and solids, or a combination thereof. Boundary conditions should be selected to ensure that the behaviour of the computer model approximates, as closely as necessary, the behaviour of the actual component or structure. It should also be apparent that the accuracy of the final results is inherently linked to the accuracy of the material and loading data supplied.
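As a deliberately simple, concrete illustration of these building blocks, the sketch below assembles a one-dimensional model of an axially loaded bar from two-node bar elements, applies a fixed support at one end and a point load at the other, and solves for the nodal displacements. The element type, mesh density and all numerical values are assumptions made purely for this example.

```python
import numpy as np

# Minimal sketch: a bar under an axial end load, modelled with two-node bar
# elements. All values below (E, A, L, P, mesh density) are assumed for
# illustration only.
E = 210e9       # Young's modulus (Pa), assumed steel
A = 1e-4        # cross-sectional area (m^2)
L = 1.0         # bar length (m)
P = 1e4         # axial end load (N)
n_elem = 4      # number of elements, i.e. the chosen level of detail

n_nodes = n_elem + 1
le = L / n_elem                                  # element length
k_e = (E * A / le) * np.array([[1.0, -1.0],
                               [-1.0, 1.0]])     # element stiffness matrix

# Assemble the global stiffness matrix from the element building blocks
K = np.zeros((n_nodes, n_nodes))
for e in range(n_elem):
    K[e:e + 2, e:e + 2] += k_e

# Loading: a single point load at the free end
f = np.zeros(n_nodes)
f[-1] = P

# Boundary condition: node 0 fully fixed (its row and column are removed)
u = np.zeros(n_nodes)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])

print("Nodal displacements (m):", u)
print("Exact tip displacement PL/EA (m):", P * L / (E * A))
```

Even in this trivial case, the engineering decisions discussed above are visible: the choice of element type, the mesh density, the representation of the support and the representation of the load.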
Specifying the material data for any analysis is normally a straightforward task. However, there are situations where the material data may contain considerable approximations and the accuracy of the final results should be viewed with this fact in mind. The following list illustrates situations where the accuracy of the material input data requires particular consideration:
Composite materials
Non-linear material and damage
Fatigue and fracture
Time-dependent material
The above list is hardly exhaustive and, in the author's experience, the majority of real analyses also involve considerable approximations, with respect to loading in particular. Once again, engineering experience is required to ensure that the situation being examined is a conservative one in design, or suitably representative in a diagnostic analysis. A simple illustration of how such input approximations feed through to the results is sketched below.
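As a back-of-envelope illustration of this point, the sketch below uses the closed-form solution of the same assumed bar problem to show how a ±10% uncertainty in the Young's modulus feeds directly through to the computed tip displacement; exactly the same reasoning applies to uncertain loading data.

```python
# Illustrative sensitivity check (all values assumed): the effect of a +/-10%
# uncertainty in Young's modulus on the tip displacement of the simple bar
# example, using its exact solution delta = P*L/(E*A).
E_nominal = 210e9            # Pa, assumed
P, L, A = 1e4, 1.0, 1e-4     # N, m, m^2, assumed

for factor in (0.9, 1.0, 1.1):
    E = factor * E_nominal
    tip = P * L / (E * A)
    print(f"E = {E:.3e} Pa  ->  tip displacement = {tip:.3e} m")
```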
The actual creation of the finite element model requires experience in the use of finite element systems if adequate results are to be achieved at reasonable cost. It is interesting to note that it is this part of the whole process that many experienced analysts eventually find tedious and boring. The idealisation phase and the engineering assessment of the results, on the other hand, invariably remain rewarding and fascinating.
An understanding of how the component or structure behaves will allow a reasonable attempt to be made at the creation of the model. When one considers that convergence studies are not generally carried out and that decisions are commonly taken on the results of a single analysis, it becomes clear that there is considerable scope for error. The necessary skills and experience required for effective analysis are discussed in the following section, but it is appropriate to note that developments in technology are also having an impact in this area.
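To make the idea of a convergence study concrete, the sketch below (again with assumed one-dimensional bar data, this time under a uniformly distributed axial load) refines the mesh repeatedly and watches the stress recovered at the fixed end settle towards the known exact value. In a real analysis the same principle is applied to the quantities on which the design decision actually rests.

```python
import numpy as np

# Sketch of a mesh-refinement (convergence) study on assumed data: a bar fixed
# at x = 0 carrying a uniformly distributed axial load q, modelled with two-node
# bar elements. The stress in the element at the support is compared with the
# exact value q*L/A as the mesh is refined.
E, A, L, q = 210e9, 1e-4, 1.0, 1e4   # Pa, m^2, m, N/m (all assumed)

def support_stress(n_elem):
    le = L / n_elem
    k_e = (E * A / le) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    K = np.zeros((n_elem + 1, n_elem + 1))
    f = np.zeros(n_elem + 1)
    for e in range(n_elem):
        K[e:e + 2, e:e + 2] += k_e
        f[e:e + 2] += q * le / 2.0               # consistent nodal loads
    u = np.zeros(n_elem + 1)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])    # node 0 fixed
    return E * (u[1] - u[0]) / le                # stress in the first element

exact = q * L / A
for n in (2, 4, 8, 16, 32):
    s = support_stress(n)
    print(f"{n:3d} elements: stress = {s:.4e} Pa, error = {abs(s - exact) / exact:.1%}")
```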
The creation and validation of the finite element model has become one of the most time-consuming and costly parts of any project. It should therefore be obvious that a good pre- and post-processor to handle the model and results is essential for companies active in this area. Increasingly, geometry is provided from a 3D CAD system. This geometry is invariably produced for manufacturing purposes and will require modification before it can be used for analysis. Further details of the CAD/FEM Interface may be found by following this link.
It is interesting to note that, formerly, the most expensive part of many projects would have been the actual computing cost of the solution. Today, for the majority of analyses, this cost has decreased in significance to such an extent that it is normally far less than the cost of the labour involved in building the model. This trend is sure to continue as the technology develops.
The actual solution phase is carried out entirely by the computer and in reality represents a welcome break from the workstation for the analyst. However, in many instances, lunch breaks have shrunk to coffee breaks thanks to the increasing power of the computer!
The assessment of the model integrity invariably involves much engineering common sense and should not be entrusted to the inexperienced engineer. This stage in the process is essential in providing some confidence in the results. There is little point in carrying out engineering assessment of any significance until the assessment of the model integrity has been satisfactorily completed.
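One simple, concrete example of such an integrity check is sketched below, again on the assumed bar model used earlier: the reaction recovered at the support must balance the total applied load. In practice this kind of check sits alongside inspection of units, mesh quality, deformed shape and unintended free edges or unconnected parts.

```python
import numpy as np

# Illustrative model-integrity check on assumed data (the simple bar model used
# in the earlier sketches): the reaction recovered at the fixed node should
# balance the applied load, otherwise something is wrong with the model.
E, A, L, P, n_elem = 210e9, 1e-4, 1.0, 1e4, 4
le = L / n_elem
k_e = (E * A / le) * np.array([[1.0, -1.0], [-1.0, 1.0]])
K = np.zeros((n_elem + 1, n_elem + 1))
for e in range(n_elem):
    K[e:e + 2, e:e + 2] += k_e

f = np.zeros(n_elem + 1)
f[-1] = P                                    # point load at the free end

u = np.zeros(n_elem + 1)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])    # node 0 fixed

reaction = K[0, :] @ u                       # force carried by the support
assert np.isclose(reaction, -P), "reactions do not balance the applied load"
print(f"Support reaction = {reaction:.3e} N balances applied load = {P:.3e} N")
```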
Finally, it is worth noting that although finite element systems will provide answers to numerous decimal places, the final accuracy of the results should be assessed after due consideration has been given to the accuracy of all input data and to the level of approximation used.
Skill and experience
The various steps involved in a typical finite element analysis are listed in the following figure.
Also shown in this figure is a breakdown of the time spent on the various steps for a typical analysis of a pressure vessel component. It may be observed that the large majority of the time relates to the creation and verification of the finite element model. It should be recognized that developments in technology, such as feature-based modellers and automatic and adaptive meshing, have great potential to impact on this time-consuming and often tedious phase of the analysis.
Shown in the following figures is a breakdown of the analysis process, indicating the different types of knowledge and experience necessary for satisfactory completion of the analysis.
It may be observed that the early idealisation stage in fact requires very little knowledge of finite element analysis. Likewise, at the end of the process, the requirement is again for engineering skill rather than specialist finite element ability. The phases in the middle of the process are those which are effectively being de-skilled by developments in technology. It is likely, therefore, that new generations of finite element systems will allow engineers to concentrate on the engineering, with less emphasis on the black art of finite element analysis.
The developments in finite element systems, discussed in the following section, should result in a further spread of the technology into new industries and into the hands of skilled engineers, who are not necessarily specialist finite element practitioners.
A brief history
The mathematical roots of the finite element method can be traced back hundreds, if not thousands, of years. The essential concept of discretisation was demonstrated by Eudoxus of Cnidus some 2400 years ago. Using the method of exhaustion he was able to estimate areas by replacing the original complex region by an assemblage of simpler sub-regions of known area.
The use of trial functions, another essential ingredient of the finite element method, was introduced almost 200 years ago. At the end of the 18th century Gauss used global trial functions. In 1943, however, Courant developed the idea of piecewise continuous trial functions. By extending the use of trial functions from the global to the local level, Courant laid the last foundation stone for the development of the finite element method as we know it today. During the 1940s, aircraft designers were using analysis methods which were an early form of the finite element method. During the 1950s, key contributions to the technology were made by Levy and Argyris, amongst others. It was not until 1960, however, that Clough first used the term 'finite element'.
The proceedings of the 1st Conference on Matrix Methods in Structural Mechanics clearly demonstrate the diversity of application and the potential of this new numerical tool. The first textbook on the finite element method is generally attributed to Zienkiewicz and Cheung in 1967.
The development of this powerful and versatile engineering tool has also been inextricably linked to accompanying developments in hardware and software.
Today's finite element systems owe as much to developments in this area as they do to the developments in the underlying theory.
During the 1960s and 70s the publication of new element formulations was common, and the rapid proliferation of element types soon led to commercial finite element systems with element libraries which covered most common problem areas.
However, the point of diminishing returns has been reached in this area and the publication of new and better element formulations is rare. The displacement formulation in general, and the isoparametric formulation in particular, has achieved dominance.
There is now a range of general purpose finite element systems available and the trend is towards ease of use and integration with other CAD functions and systems.
These developments in finite element systems should result in a further spread of the technology into new industries and into the hands of skilled engineers who are not necessarily specialist finite element practitioners. The importance of the new generation of finite element systems in allowing designs to be assessed accurately in realistic timescales cannot be overstated.