Model and modeling between objective and subjective (examples / 1)
The architectural application of digital models based on parametric functions (in the case of genetic algorithms, and in particular of the so-called "search algorithms") brings an ever closer fusion of the design phase with the computer simulation of the object: the architect can use the software directly to generate spatial forms that are at first unpredictable, rather than designing them explicitly and only then testing them through models. The process starts from largely objective premises, linked to specific, parameterizable building requirements already defined at a preliminary stage, and, through computer simulation, produces a range of coherent solutions. Choosing the best solution, whatever the criterion, is then left to the critical, and therefore subjective, judgement of the designer, who, by manipulating the initial data, can restart the whole process if he deems it necessary.
Although the choice of the initial parameters, and their ordering by priority, is itself a somewhat subjective operation, the fact remains that the designer is always concerned with objectifying his own decisions, in the desire to make them shareable.
Using computer models to integrate structural calculation into the design process, for example, one can produce numerous formal hypotheses for the same building, already verified from the static point of view, from which to select the most effective at resisting loads (possibly with the least waste of material) or simply the most convincing formally; a minimal sketch of this generate-and-select logic follows.
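The sketch below is purely illustrative and not the tool used in any of the projects discussed here: candidate forms are reduced to parameter vectors, scored by an arbitrary stand-in for a structural check, and evolved by a simple search loop that returns several near-equivalent solutions for the designer to choose from.

```python
import random

# Each candidate form is reduced to a vector of design parameters
# (e.g. heights of the control points of a roof surface).
N_PARAMS = 8

def random_candidate():
    return [random.uniform(0.0, 1.0) for _ in range(N_PARAMS)]

def structural_cost(params):
    # Stand-in for a real static verification: penalize "material use"
    # (sum of heights) and "unevenness" (jumps between adjacent control
    # points) as a crude proxy for effectiveness against loads.
    material = sum(params)
    unevenness = sum(abs(a - b) for a, b in zip(params, params[1:]))
    return material + 2.0 * unevenness

def evolve(generations=50, population=30, keep=5):
    pool = [random_candidate() for _ in range(population)]
    for _ in range(generations):
        pool.sort(key=structural_cost)
        parents = pool[:keep]
        # Refill the population by mutating the best candidates.
        pool = parents + [
            [p + random.gauss(0.0, 0.05) for p in random.choice(parents)]
            for _ in range(population - keep)
        ]
    pool.sort(key=structural_cost)
    return pool[:keep]  # several near-equivalent solutions: the designer chooses

if __name__ == "__main__":
    for candidate in evolve():
        print(round(structural_cost(candidate), 3))
```

The point is not the fitness function, which here is arbitrary, but the division of labour: the machine produces verified variants, while the designer chooses among them or alters the premises and restarts the process.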
The design by Toyo Ito and Cecil Balmond for the Serpentine Gallery Pavilion 2002, for example, is based on a volume defined by a uniform envelope, formed by the intertwining of a number of profiled steel plates that make up a seemingly random pattern. In fact, the structural organization is based on a generative algorithm built on a square that rotates and expands.
At the base of the algorithm is a structural observation: the most efficient network with which to cover a square or rectangular plan is the one obtained by joining the midpoints of adjacent sides. The result is a series of squares inscribed one within another and rotated by 45 degrees, whose sides are extended to the perimeter of the roof and then folded down to form the side walls of the building.
The designer is therefore directly involved in the choice of the criterion that drives the generative algorithm, a criterion which, being in this case a structural one, lends itself to being objectified.
The calculation process is iterated several times, introducing and varying an asymmetry that takes a point near, rather than exactly at, the midpoint, thus enabling endless variations of the envelope depending on the value of the asymmetry and on the number of repetitions of the calculation. The possibility of calibrating this asymmetry is the hallmark of the computer model around which the project is built, and of the tool that makes it easily manageable; a sketch of the underlying construction is given below.
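As a minimal sketch of this construction (assumed geometry only, not Balmond's actual algorithm): starting from a square, each iteration joins a point placed at a parametric position along each side; with the parameter at 0.5 one obtains the classic inscribed square rotated by 45 degrees, and any other value introduces the asymmetry described above.

```python
# Nested "rotated square" pattern of the kind described above: each
# iteration joins a point at fraction t along each side of the previous
# polygon. t = 0.5 gives the exact midpoints (45-degree rotation);
# other values introduce the asymmetry that drives the variations.

def next_polygon(points, t=0.5):
    n = len(points)
    return [
        (
            points[i][0] + t * (points[(i + 1) % n][0] - points[i][0]),
            points[i][1] + t * (points[(i + 1) % n][1] - points[i][1]),
        )
        for i in range(n)
    ]

def rotated_squares(side=1.0, t=0.55, iterations=8):
    square = [(0.0, 0.0), (side, 0.0), (side, side), (0.0, side)]
    polygons = [square]
    for _ in range(iterations):
        polygons.append(next_polygon(polygons[-1], t))
    return polygons

for poly in rotated_squares():
    print([(round(x, 3), round(y, 3)) for x, y in poly])
```

In the built pavilion the segments of these squares were then extended to the roof perimeter and folded down the sides; the sketch shows only the generating pattern.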
All the variants, resulting from a calculation that in itself has structural value, are almost equivalent from a static point of view. The choice of the final solution is thus linked to other considerations (in this case above all formal ones), which require a critical assessment of the project and which are, therefore, once again entirely subjective.
The Flux Structures of Mutsuro Sasaki are another useful example of computer modeling that transforms the objective data of the project into architecture in which the subjective component remains very strong. It is enough to look carefully at two buildings arising from the application of the same type of model: the Kitagata Cultural Center by Arata Isozaki and the crematorium in Kakamigahara by Toyo Ito.
Again, the final appearance of the building, although established in its general lines by the architect at an early stage, is the result of the calculation.
If we confined ourselves to this aspect alone, it would follow that the two buildings should be substantially identical. In fact, although the free-curvature roofs produce a strong resemblance, many differences can be detected (which we will not go into here), due to the formal selection criteria of the two architects.
Again, the computer model thus becomes a means of exploring spatial and formal outcomes that are unforeseen at the outset: it does not guarantee what the final outcome of the project will be, but it does guarantee, at every stage, consistency with the starting criteria.
In other words, the architect has complete freedom to manipulate the model by varying the initial input, with the assurance of always obtaining plausible results in static terms.
The management of the model, and the decision to stop the process at a given stage of its operations, are therefore fully design acts, which yield architectures that are innovative and entirely consistent with their authors' design research.
Model and modeling between objective and subjective (premise)
The advent of information technology in architecture has gradually replaced the idea of the model as a means of testing hypotheses with a deductive approach in which computer modeling plays an active role in the genesis of the project, as the 'process' becomes the focus of the design research.
In particular, the ability of computer models to collect and rapidly process large amounts of data allows very efficient simulations of the behavior and appearance of the building once it is built.
This kind of modeling offers, in fact, the possibility of introducing objective data into the design process, such as geometric transformations based on mathematical calculation, structural calculations or lighting requirements, so that they directly affect the final result.
The computer model has two special features with regard to its ability to react to the information provided by the designer (the input): the ability to reconfigure itself on the basis of changes in the input and, in some cases, a capacity for self-organization and self-growth (autopoiesis).
Following the principles of 'action-reaction' and of 'feedback', the design process feeds on itself in a continuous circular evolution, of which the design synthesis represents only a snapshot, a freeze-frame; a minimal sketch of such a loop is given below.
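The sketch below is a deliberately small illustration, with entirely assumed names and rules: the 'model' holds the current input, regenerates its geometry whenever that input changes, and returns a simple evaluation to the designer, closing the action-reaction loop described above.

```python
class ReactiveModel:
    """Toy parametric model: it regenerates itself whenever the input changes."""

    def __init__(self, spans, height):
        self.inputs = {}
        self.update(spans=spans, height=height)

    def update(self, **inputs):
        # Reaction to the designer's input: store it and rebuild the geometry.
        self.inputs.update(inputs)
        self.geometry = self._generate()
        return self.feedback()

    def _generate(self):
        # Stand-in geometry: a row of arches whose rise depends on the input.
        spans, height = self.inputs["spans"], self.inputs["height"]
        return [{"arch": i, "rise": height / (i + 1)} for i in range(spans)]

    def feedback(self):
        # Crude evaluation returned to the designer, feeding the next decision.
        total_rise = sum(a["rise"] for a in self.geometry)
        return {"arches": len(self.geometry), "total_rise": round(total_rise, 2)}

model = ReactiveModel(spans=4, height=6.0)
print(model.update(height=8.0))  # one changed input: the model reconfigures itself
```

Each call to `update` is one turn of the loop: input, reaction, feedback; the freeze-frame is simply whichever state the designer decides to keep.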
The relative ease of use of some modeling software must not, however, lead to a shift of the architect from active figure in the design process to mere selector of computer-generated shapes. Such a vision would in fact produce a complete homogenization of results and a consequent loss of effectiveness of the modeling tool, which would become useless if limited to exploring formal possibilities generated autonomously.
It is therefore necessary that the project, as an individual and thus subjective act, intervene to guarantee the author's necessary independence of thought, whether as the initiation or as the destabilization of the mathematical process, which is by its nature objectifying.
Although every act of design lives on the continuous interrelation between the objectivity of the data and the subjectivity of the process, one may ask how and where the computer model is positioned between these two polarities.
Our research investigates, through the examples that follow, two possible approaches: the first identifies the computer model as a tool for objectifying subjective assumptions; the second, conversely, analyzes it as a tool for subjective research starting from objective data.
The difference between the two approaches depends primarily on the way in which the design data are parameterized, and on the choice of which data, and which priorities, are assigned to them in relation to the ultimate goal the designer intends to pursue.
Carlo Gamboni
Marco Marrocchi
Flux Structures


The Flux Structures [1] are a useful model of the application of algorithms and parametric functions, with the aid of computer systems, to the development of contemporary architectural form and structure.
Theorized and developed by the Japanese engineer Mutsuro Sasaki and applied for the first time in the Kitagata Cultural Center by Arata Isozaki, the Flux Structures arise from the search for fluid, organic structures obtained in a rational way through the principles of evolution and self-organization.
Through these models, the computer is able to quickly generate shell shapes with free curved surfaces, guaranteeing the maximum rationalization of structural stresses and finally replacing the traditional experimental methods, to which they nonetheless remain conceptually close, such as those used, for example, by Gaudí.
Traditional practice requires, in fact, an initial design stage proceeding by successive attempts, during which the surface imagined by the architect is expressed by means of curvilinear functions, before moving on to the analysis of stresses and strains with classical methods.
Obviously, a change in the plan layout, with its consequences on the structure, normally sends the process back to an early stage.
Once the algorithm that generates the structure has been established, however, the computer can calculate and display numerous variations of it within minutes, obtained by changing the design parameters: at that point one can select the most interesting among those obtained. A conceptual sketch of this kind of generation is given below.
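The following is a conceptual sketch only, not Sasaki's actual formulation, and its 'energy' is a crude stand-in: the roof is described as a grid of heights pinned at assumed support points, and each sweep smooths the surface toward lower curvature, so that changing the supports or the number of sweeps yields different free-curvature variants.

```python
import random

# Toy free-curvature roof: a grid of heights z[i][j], pinned to the ground
# at the support points. Each sweep relaxes interior nodes toward the mean
# of their neighbours, lowering a surrogate "membrane energy" (curvature).
# Different supports or sweep counts produce different variants.

N = 9  # grid size (assumed)

def generate_roof(supports, sweeps=200, seed=0):
    rng = random.Random(seed)
    z = [[3.0 + rng.uniform(-0.5, 0.5) for _ in range(N)] for _ in range(N)]
    for _ in range(sweeps):
        for i in range(1, N - 1):
            for j in range(1, N - 1):
                if (i, j) in supports:
                    z[i][j] = 0.0  # supports stay on the ground
                else:
                    z[i][j] = 0.25 * (z[i - 1][j] + z[i + 1][j]
                                      + z[i][j - 1] + z[i][j + 1])
    return z

variant_a = generate_roof(supports={(2, 2), (6, 6)})
variant_b = generate_roof(supports={(2, 6), (6, 2), (4, 4)})
print(round(variant_a[4][4], 3), round(variant_b[4][4], 3))
```

As in the built examples, each change of parameters produces a new variant in seconds; the choice among them remains with the designer.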
[1] See M. Sasaki, Flux Structure, Toto, Tokyo, 2005; or M. Sasaki, Flux Structures, in "Casabella", no. 752, February 2007, pp. 26-29.