Tuesday, October 28, 2008


Ah, this market! Reflections

A market that rages everywhere: paintings sold for frightening sums, artists who call themselves such because someone once awarded them a prize, usually a small and sad one... Now look around the web, with its communities and its sites that want to sell art... But how much of this art is really good, how much is it worth, how much of it conveys emotion without gratuitous provocation, without leaving disgust behind? Works that do not rely on self-reference, on provocation, on bad taste and nothing more have become rare.
Periodically we observe a plethora of new artists who invent nothing new but simply rework something that already exists. In digital art there are a few great gurus who still manage to convey emotion, who still manage to impress; but in "classic" art as we all know it, the sadness we see around us is disarming.
So I truly hope that digital art takes the place of material art or, at least, helps material art to grow, to mature, to innovate.

Friday, October 3, 2008


late in the evening ...

It is becoming increasingly difficult to combine art with the web, more and more difficult to move through the meanders of the network looking for that flash of inspiration that lets us create an application, a website, a slideshow that is "art" to all intents and purposes.
And that is because, in any case, the common standard of this medium, the network, constrains us: we are limited, for example, to visual content... We cannot touch, we cannot smell... It remains a partial experience... Even when making art...

I have posted the slideshow of my last two works... hanging on the wall they are another thing... much better...

Tuesday, June 24, 2008


Model and modeling between objective and subjective (examples / 1)

The application to architecture of digital models based on parametric functions (in this case genetic algorithms, and in particular so-called "search algorithms") involves an increasing fusion of the design phase with the computer simulation of the object, because the architect can use the software directly to generate spatial forms, unpredictable at the outset, rather than designing them explicitly and then testing them through models. The starting premises are therefore largely objective, linked to specific, parameterizable building requirements defined at a preliminary stage; through computer simulation, several coherent solutions are then produced. Choosing the best solution, whatever the criterion, is left to the critical, and therefore subjective, judgment of the designer, who, by manipulating the initial data, can restart the whole process if deemed necessary.
Although the choice of the initial parameters, and their ordering according to priorities, is therefore itself somewhat subjective, the fact remains that the designer is always concerned with objectifying his own decisions, out of the desire to make them shared.
Using computer models, for example, to integrate structural calculation within the design process, one can produce numerous formal hypotheses for the same building, already verified from the static point of view, among which one can find the most effective in resisting loads (possibly with the least waste of material) or simply the most suitable from the formal point of view.
The project by Toyo Ito and Cecil Balmond for the Serpentine Gallery Pavilion 2002, for example, is based on a volume defined by a uniform envelope formed by the intertwining of a number of profiled steel plates that make up a seemingly random pattern. In fact, the structural organization is based on a generative algorithm applied to a square that rotates and expands.
At the base of the algorithm is a structural observation: the most efficient network to cover a square or rectangular plan is obtained by joining the midpoints of adjacent sides. The result is a series of squares inscribed within one another and rotated 45 degrees, whose sides are extended to the perimeter of the roof and then folded down to form the lateral faces of the building.
The designer, therefore, is directly involved in choosing the criterion that drives the generative algorithm, which, in this case, being a structural observation, lends itself to being objectified.
The calculation process is iterated several times, introducing and modifying an asymmetry: instead of the exact midpoint, a point near it is taken, thus enabling infinite variations of the envelope depending on the value of the asymmetry and the number of repetitions of the calculation.
The ability to calibrate this asymmetry is the hallmark of the computer model built around this rule, and the tool that makes it easily manageable.
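The generative rule just described can be sketched in a few lines of code. This is an illustrative reconstruction, not Ito and Balmond's actual software, and the function names and the unit square are my own assumptions: with t = 0.5 each step joins the exact midpoints of adjacent sides, producing squares inscribed in one another and rotated 45 degrees, while moving t away from 0.5 introduces the asymmetry described above.

```python
# Minimal sketch (illustrative only) of the inscribed-squares rule:
# join the point at parameter t along each side of a polygon to get
# the next polygon; t = 0.5 gives exact midpoints.

def next_polygon(points, t=0.5):
    """Join the point at parameter t along each side to form the next polygon."""
    n = len(points)
    return [
        (
            points[i][0] + t * (points[(i + 1) % n][0] - points[i][0]),
            points[i][1] + t * (points[(i + 1) % n][1] - points[i][1]),
        )
        for i in range(n)
    ]

def generate_pattern(square, iterations, t=0.5):
    """Iterate the rule, collecting every intermediate polygon."""
    polygons = [square]
    for _ in range(iterations):
        polygons.append(next_polygon(polygons[-1], t))
    return polygons

# unit square, three iterations with the symmetric rule (t = 0.5)
pattern = generate_pattern([(0, 0), (1, 0), (1, 1), (0, 1)], 3)
```

Extending the sides of each polygon in `pattern` to the roof perimeter, as the text explains, yields the pavilion's seemingly random line pattern; passing a different `t` reproduces the calibrated asymmetry.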
All variants, resulting from a calculation that itself has a structural value, are almost equivalent from a static viewpoint.
The choice of the final solution is thus linked to other considerations (in this case, especially formal ones), for which a critical, and therefore once again totally subjective, assessment of the project is needed.
The Flux Structures of Mutsuro Sasaki are another useful example of computer modeling that transforms the objective data of the project into architecture in which the subjective component remains very strong. It is enough to look carefully at two architectures arising from the application of the same type of model: the Kitagata Cultural Center by Arata Isozaki and the Kakamigahara crematorium by Toyo Ito.
Again, the final appearance of the building, although established in its general lines by the architect at an early stage, is the result of the calculation.
If we confined ourselves to this aspect, it would follow that the two buildings should tend to be substantially equal. In fact, although the two free-curvature roofs show a strong similarity, many differences can be detected (which we will not deal with here), deriving from the formal selection criteria of the two architects.
Again, then, the computer model becomes a means of exploring spatial and formal results that are unforeseen at the start: the final outcome of the project is not guaranteed in advance, but its consistency with the starting criteria is guaranteed at every stage.
In other words, the architect has total freedom to manipulate the model by varying the initial input, with the assurance of always obtaining statically plausible results.
The management of the model, and the decision to stop the process at a certain stage of its operations, therefore remain fully the designer's, and they yield architectures that are innovative and fully consistent with their authors' research.

Monday, June 23, 2008


Model and modeling between objective and subjective (hypothesis).

The advent of information technology in architecture has gradually replaced the idea of the model as a means of verifying hypotheses with a deductive approach in which computer modeling plays an active role in the genesis of the project, as the "process" itself becomes the focus of design research.
In particular, the ability of computer models to collect and quickly process large amounts of data allows for very efficient simulations of the behavior and appearance of the building once it is built.
This kind of modeling offers, in fact, the possibility of introducing objective data into the design process, such as geometric transformations based on mathematical calculation, structural requirements, lighting requirements, so that they directly affect the final result.
The computer model has two special features relating to its ability to react to the information provided by the designer (input): the capacity to modify itself in response to changes in input and, in some cases, the capacity for self-organization and self-growth (autopoiesis).
Following the principles of "action-reaction" and "feedback", the design process feeds on itself in a continuous circular evolution, of which the design synthesis represents only a snapshot, a freeze-frame.
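As a schematic toy example of this "action-reaction" loop (everything below is an invented stand-in of mine, not a real design tool): the designer provides an input, the model reacts, the output is compared with a requirement, and the corrected input is fed back until the result is acceptable.

```python
# Toy feedback loop (illustrative only): a one-parameter "model" is
# iteratively corrected until its output meets a requirement.

def model(scale, base_area=80.0):
    # stand-in generative model: the form follows a single parameter
    return base_area * scale ** 2

def feedback_design(required_area, scale=1.0, tolerance=0.5, max_steps=100):
    """Iterate input -> model -> comparison -> corrected input."""
    for step in range(max_steps):
        area = model(scale)
        error = required_area - area
        if abs(error) <= tolerance:
            return scale, area, step
        scale += 0.001 * error     # feedback: reaction proportional to the error
    return scale, model(scale), max_steps

scale, area, steps = feedback_design(required_area=120.0)
```

The design synthesis is, as the post says, a freeze-frame: stopping the loop at any step yields one snapshot of a process that could continue reacting to new input.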
The relative ease of use of some modeling software must not, however, turn the architect from an active figure in the design process into a mere selector of computer-generated shapes. That vision would in fact lead to a flattening of results and a consequent low efficiency of the modeling tool, which would become useless if we simply explored the formal possibilities it generates on its own.
It is necessary, therefore, for the project as an individual, and thus subjective, act to intervene and guarantee the necessary independence of thought of the author, triggering or destabilizing the mathematical process, which is by its nature objectifying.
Although every act of design lives on a continuous interrelationship between the objectivity and the subjectivity of the process data, one may ask how the computer model is positioned between these two polarities.
Our research investigates, through the examples that follow, two possible approaches: the first identifies the computer model as a tool for objectifying subjective assumptions; the second, conversely, analyzes it as a tool for subjective research starting from objective data.
The difference between the two approaches depends primarily on how the design data are parameterized, and on the choice of the data and of the priorities assigned to them in relation to the ultimate goal the designer intends to pursue.

Carlo Gamboni
Marco Marrocchi

Tuesday, June 10, 2008


Flux Structures




The Flux Structures [1] are a useful model for applying algorithms and parametric functions, with the aid of computer systems, to the development of contemporary architectural form and structure.

Theorized and developed by the Japanese engineer Mutsuro Sasaki and applied for the first time in the Kitagata Cultural Center by Arata Isozaki (images above), the Flux Structures arise from research into fluid, organic structures, generated in a rational way using principles of evolution and self-organization.
Through these models, the computer can quickly generate the shape of a shell with free curved surfaces, guaranteeing maximum rationalization of the structural effort and finally replacing traditional experimental methods, while remaining conceptually similar to those used, for example, by Gaudí.
Traditional practice, in fact, requires an initial stage of planning by successive attempts, during which the surface imagined by the architect is expressed with curvilinear functions, before proceeding to the analysis of stresses and strains with classical methods.
Obviously, a change in the plan organization, with consequences for the structure, normally sends this process back to an early stage.
Once the algorithm that generates the structure is established, however, the computer can calculate and display numerous variations of it in minutes, obtained by changing the design parameters: one can then select the most interesting of those obtained.
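A toy sketch may illustrate this kind of computational form-finding and the hanging-chain analogy with Gaudí mentioned above. It is conceptual only: the node count, load and step size are arbitrary values of mine, and Sasaki's actual sensitivity-analysis method is far more sophisticated. Interior nodes of a chain pinned at both ends are relaxed under a uniform load until equilibrium; inverting the hanging shape gives a funicular, compression-only arch.

```python
# Toy form-finding (illustrative only): relax a hanging chain to
# equilibrium, then invert it into a compression arch (Gaudi's analogy).

def relax_chain(n_nodes=11, span=10.0, load=0.05, steps=5000, dt=0.1):
    xs = [span * i / (n_nodes - 1) for i in range(n_nodes)]
    ys = [0.0] * n_nodes          # start flat; the two ends stay pinned at y = 0
    for _ in range(steps):
        new_ys = ys[:]
        for i in range(1, n_nodes - 1):
            # pull each free node toward the average of its neighbours,
            # shifted down by the load (a crude equilibrium iteration)
            target = 0.5 * (ys[i - 1] + ys[i + 1]) - load
            new_ys[i] = ys[i] + dt * (target - ys[i])
        ys = new_ys
    return xs, [-y for y in ys]   # invert: hanging form -> compression arch

xs, arch = relax_chain()
```

Changing the load or the number of nodes and rerunning yields a different equilibrium shape in a fraction of a second, which is exactly the point made above about generating and comparing many variants quickly.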
[1] See M. Sasaki, Flux Structure, Toto, Tokyo, 2005; or M. Sasaki, Flux Structures, in "Casabella" no. 752, February 2007, pp. 26-29.

Thursday, May 22, 2008


Maybe I did ...

The other day I published the new version of my personal website... and I must say I am quite satisfied with it.
It is a sort of chest of drawers, one of those vintage ones, with three drawers that open when the user decides to click on them, and inside which you can find information, links to external sites, news...
What I like is that the site engages the user as if he had a sheet of paper in front of him, to leaf through, to scroll...
I do not know whether the experiment is successful or whether we are dealing with yet another web "artifact", but I enjoyed it and I will repeat it.

Sure, the site does not look like a corporate website, nor does it have the elegance of a technically impeccable page: it is just a hand-painted web page onto which the ActionScript code that makes it work was pasted... And Flash MX helped me greatly... As did my scanner...

Friday, May 9, 2008


genetic algorithm (model)


Toyo Ito / Cecil Balmond, Serpentine Gallery Pavilion, 2002.


Diagram-based design has always used a sequential method to set out and solve specific problems, through a process that, under certain conditions, leads to a unique solution (linear algorithm).
The increasing complexity of the relationships between the many variables involved, together with frequent functional indeterminacy and ever shorter design times, calls for a more open and flexible approach in order to maximize the contribution of information systems.
An interesting response to these requirements may be found in a diagrammatic procedure that breaks the project down into n components (functions, structures, systems...), which represent the initial input. These are then related to one another through an algorithm that allows the computer to develop an instant architectural formalization (genetic algorithm) [1].
This is a heuristic method that seeks the optimal solution to the problem by selecting among a number of possible solutions, immediately verified, resulting from even small variations of the input (genetic mutation).
In this context, the relationship between the structural design and the formal aspects of the building appears, in my view, particularly interesting, with reference to the possible topological variations based on algorithms of this type (see the works of Toyo Ito, for example).
The purpose of this research is to verify, through examples of recent architecture, a possible "case approach" to the computer model by means of a diagrammatic procedure, the genetic algorithm, able to explain the "relationship between the parts" [2].
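The heuristic loop just described (generate candidates, verify them immediately, select the fittest, mutate the input slightly, repeat) can be sketched as a minimal genetic algorithm. This is illustrative only and not tied to any specific architectural software: the parameter vector, the toy fitness function and all the numbers are my own assumptions.

```python
import random

# Minimal genetic-algorithm sketch (illustrative only): candidates are
# parameter vectors; fitness stands in for whatever structural or formal
# criterion the designer chooses.

def fitness(candidate):
    # toy objective: prefer parameters close to a target profile
    target = [0.2, 0.5, 0.8]
    return -sum((c - t) ** 2 for c, t in zip(candidate, target))

def mutate(candidate, scale=0.1):
    # small variation of the input ("genetic mutation")
    return [c + random.uniform(-scale, scale) for c in candidate]

def evolve(pop_size=20, generations=100, n_params=3, seed=0):
    random.seed(seed)
    population = [[random.random() for _ in range(n_params)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]          # immediate verification + selection
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return max(population, key=fitness)

best = evolve()
```

In a design setting the fitness function would encode the structural or formal criterion chosen by the designer, which is exactly where the subjective component re-enters the process.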

[1] For the definition of genetic algorithm, see the special issue Diagrams of "Lotus", no. 127, 2006, p. 16.
[2] See A. Saggio, Introduction to The IT Revolution in Architecture, par. 8.8, "Clouds or diagrams", p. 90.

Chosen chapter: 8. Model.
Book from the series The IT Revolution in Architecture: Ito digitale. New media, new reality, by Patrizia Mello.
Paper on the theme: Flux Structure by Mutsuro Sasaki.
Extra-disciplinary paper: A Thousand Years of Nonlinear History. Rocks, germs and words, by Manuel De Landa.


The corrections and clarifications requested by prof. Saggio have already been inserted into the ten lines of the working hypothesis, which have thus been expanded compared to the meeting of 5/8.
As for the need to narrow the discussion, which may be too broad for the time constraints of the seminar, I believe we can solve the problem by focusing on a few samples to analyze (e.g. the structural surfaces used in some recent architectures by Ito).


Toyo Ito, Taichung Metropolitan Opera House, 2005.

Friday, April 25, 2008


work program

The synthesis of the work program that I sent to prof. Saggio has been published on the blog of the seminar.

http://antoninosaggio.blogspot.com/2008/04/ii-seminario-di-antonino-saggio.html

Tuesday, February 5, 2008


Web 2.0 and data redundancy

Reflecting on all the issues that rage these days on the net and in magazines, one thing occurred to me: what amount of data are we loading online through this Web 2.0? How many GB upon GB of memory go into uploading photos to Flickr or videos to YouTube?
Sooner or later that space will run out, just as the space on our hard disks eventually does. Granted, ever larger storage is sold at cheap prices, but I think this redundancy of information is gradually accustoming us to keeping even what is "unclean", in the sense of badly made, not optimized; we risk losing the aesthetic sense that characterizes us as artists, whether photographers or videographers.
Who says that a badly taken photo that I keep in memory anyway, or a low-resolution movie that I never watch but have saved on my PC, does not prevent me from weeding things out, does not make me lose sight of what a better product, necessarily available in smaller quantity, could give me?
People now browse, upload photos that no one will ever look at, open blogs they never use, post videos for self-glorification... but do we really need all this data for the development of Web 2.0, or is it just a way for the biggest sites to attract clicks, people, and thus advertising revenue?