Temporal workflow systems

When : Thursday, November 30, 2006 - 15:00
Speaker : Prof. Carlo Combi
Affiliation : Università di Verona
Where : Aula Magna `A. Lepschy`
Description :

Workflows describe business processes consisting of the coordinated execution of atomic activities (tasks) by human performers or by devices of various kinds. Workflow management systems (WfMSs) are software systems that support the execution of workflows. Several temporal aspects must be handled by a WfMS: checking constraints on the durations of individual tasks and on the allowed delays, the availability over time of the agents to whom task execution is assigned, and changes to workflow schemas. These are only a few examples of the temporal issues that arise in workflow systems. Moreover, most WfMSs are built on top of database systems, where the temporal aspects inherent in both the specification and the management of workflows must be handled explicitly, application by application.
In this seminar, after a brief introduction to the basic concepts of workflow systems, I will consider some aspects of the management of temporal information in workflow systems; in particular, I will consider the use of a temporal database system to support the main modules that make up a workflow system. For example, a temporal database system can easily handle queries that retrieve a workflow schema as it varies over time, or that balance over time the workload to be assigned to each agent. Different architectures for temporal workflow systems will also be discussed.
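To make the idea concrete, the following is a minimal sketch (in Python, with made-up table contents and names, not the systems discussed in the seminar) of the kind of valid-time lookup a temporal database supports, here used to retrieve the workflow schema version that was in force at a given date:

```python
from datetime import date

# Hypothetical valid-time table of workflow schema versions:
# each row records which schema version was in force over which interval.
schema_versions = [
    {"workflow": "order_processing", "version": 1,
     "valid_from": date(2005, 1, 1), "valid_to": date(2005, 9, 30)},
    {"workflow": "order_processing", "version": 2,
     "valid_from": date(2005, 10, 1), "valid_to": date(9999, 12, 31)},
]

def schema_at(workflow, when):
    """Return the schema version that was valid for `workflow` on date `when`."""
    for row in schema_versions:
        if row["workflow"] == workflow and row["valid_from"] <= when <= row["valid_to"]:
            return row["version"]
    return None

print(schema_at("order_processing", date(2005, 6, 15)))  # -> 1
```

In a temporal DBMS this becomes a single query over valid-time tables; the point is that schema history and agent availability are first-class, queryable data rather than logic re-implemented in every application.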
Seminar hosted as part of the course `Intelligenza Artificiale` (Prof. S. Badaloni), Corso di Laurea in Ingegneria Informatica.


A cognitive approach to imitation learning

When : Thursday, November 16, 2006 - 15:00
Speaker : Dr. Haris Dindo
Affiliation : Università degli Studi di Palermo
Where : Aula Magna `A. Lepschy`
Description :

The goal of imitation learning in robotic systems is to reduce the complexity of robot programming by showing robots how to perform given tasks. Through imitation, the robot can learn from the surrounding environment and adapt to it, much as happens during a child's development.
The seminar will present the ConSCIS (Conceptual Space based Cognitive Imitation System) architecture, inspired by recent findings in cognitive science, biology and neuroscience. The architecture integrates several aspects of imitation, from visual perception and motor control to knowledge representation and reasoning, and has been validated through numerous experiments carried out on an anthropomorphic robotic platform.
Seminar hosted as part of the course `Intelligenza Artificiale` (Prof. S. Badaloni), Corso di Laurea in Ingegneria Informatica.


Vision-based Robotic Grasping Inspired by Neuroscience

When : Thursday, October 26, 2006 - 15:00
Speaker : Dr. Eris Chinellato
Affiliation : Universitat Jaume I, Castellón, Spain
Where : Aula Magna `A. Lepschy`
Description :

In vision-based grasping actions, the integration between the sensory, associative and motor cortices of the human brain is achieved by coordinating the two visual streams of the human cortex: the action-oriented dorsal stream and the perception-oriented ventral stream. In this talk, we will compare present-day research on vision-based robotic grasping with these neuroscience findings, and outline a model of vision-based grasp planning that builds strongly upon primate, and especially human, physiology, aiming to emulate the dualism and the interaction between the two streams. The model has been conceived to be applied to a robotic setup, and the different modules, inspired by brain areas, have been designed taking into account not only biological plausibility but also practical issues related to engineering constraints.


The use of the euphonic voice by teachers

When : Thursday, June 22, 2006 - 15:00
Speaker : R. De Santi
Affiliation : Azienda Ospedaliera di Padova
Where : Aula Magna `A. Lepschy`
Description :

We define as voice professionals those who, for work-related reasons, must use their vocal instrument in a prolonged and stressful way. For these professionals the voice therefore becomes a precious and indispensable asset for carrying out their activity at its best.
Before embarking on the long path of teaching, teachers should acquire, through appropriate counselling, all the notions needed to avoid organic and functional laryngeal alterations, which are the consequences of misusing the voice during their work.
The seminar will first provide some notes on the anatomy and physiology of the pneumo-phono-articulatory apparatus, and will then move on to a more applied part, illustrating some rules of vocal hygiene useful for reducing the phonotrauma that these subjects frequently produce when using their voice professionally. In this way incorrect functional habits should be reduced, and teachers could gain significant benefits for their voice, the basic tool of their work.


TCAD and DFM in the sub-100nm era

When : Thursday, June 15, 2006 - 16:30
Speaker : L. Sponton
Affiliation : Swiss Federal Institute of Technology (ETH), Zurigo (CH)
Where : Aula Magna `A. Lepschy`
Description :

In advanced CMOS VLSI technologies the critical dimensions are well below 100 nm. For such technologies, the use of smart Design for Manufacturing techniques and corresponding technology optimization, which one can call `Manufacturing for Design`, is becoming mandatory in order to achieve reasonable yields. Variations in the main transistor characteristics arise from different sources of random and systematic process variability; 3D process effects, sub-wavelength lithography and stress effects are the most important.
In this scenario TCAD is an invaluable tool for understanding the physics behind the various phenomena and for quantitatively characterizing the expected impact of each effect and their interactions. Using TCAD, it is possible to study, by means of 3D simulations, the behavior of small devices and the impact of stress effects and non-uniform gate shapes on the final characteristics of a transistor. In this talk, an approach based on full 3D process and device simulations, and its influence on the development of a TCAD-to-circuit simulation flow, will be presented.


Flexible Slow and Fast Light in Optical Fibres

When : Thursday, June 15, 2006 - 15:00
Speaker : L. Thevenaz
Affiliation : EPFL Swiss Federal Institute of Technology, Losanna (CH)
Where : Aula Magna `A. Lepschy`
Description :

A method to achieve an extremely wide and flexible external control of the group velocity of signals as they propagate along an optical fibre will be presented. This control is achieved by means of the gain and loss mechanisms of stimulated Brillouin scattering in the fibre itself. Our experiments show that group velocities below 71,000 km/s on the one hand, group velocities well exceeding the speed of light in vacuum on the other, and even negative group velocities can readily be obtained with a simple benchtop experimental setup.
The experimental results show a tuneable optical delay of as much as 152 ns for a 40 ns pulse. Stimulated Brillouin scattering offers the key advantage of generating synthesized gain spectra, so that innovative slow-light schemes can be realized, ranging from broadband tunable delays to a zero-gain situation identical to an ideal electromagnetically-induced transparency.
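For scale (a back-of-the-envelope figure derived from the number quoted above, not stated explicitly in the abstract), a group velocity of 71,000 km/s corresponds to a group index of roughly

\[
n_g = \frac{c}{v_g} \approx \frac{2.998\times10^{5}\ \mathrm{km/s}}{7.1\times10^{4}\ \mathrm{km/s}} \approx 4.2,
\]

about three times larger than the group index of an unpumped silica fibre.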


Computational mass spectrometry for protein identification

When : Thursday, June 8, 2006 - 15:00
Speaker : C. Garutti
Affiliation : DEI
Where : Aula Magna `A. Lepschy`
Description :

Mass spectrometry (MS) is an analytical technique used to determine the mass-to-charge ratio of ions. In the last decade, MS has emerged as a powerful method for the analysis of biological molecules, used to obtain the chemical composition of a compound and the amount of a compound in a sample. An important field of application for this technique is proteomics, where protein structure determination and biomarker detection are two major issues. Tandem mass spectrometry (MS/MS) involves multiple steps of mass selection or analysis, after fragmentation of a specific peptide ion, and makes it possible to elucidate the composition of complex molecules that cannot be determined by MS alone. This presentation will introduce MS and MS/MS, and then focus on the computational problems related to the interpretation of their results, such as dimensionality reduction, peptide identification via database search, and de novo peptide sequencing.
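As a toy illustration of the database-search flavour of peptide identification (not the actual tools discussed in the talk; the residue masses are approximate, and the candidate list and measured mass are invented), a theoretical peptide mass can be computed from residue masses and matched against a measured mass within an instrument tolerance:

```python
# Toy database-search sketch: compute a peptide's monoisotopic mass from
# approximate residue masses and keep candidates within the mass tolerance.

RESIDUE_MASS = {  # monoisotopic residue masses in Da (approximate)
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "L": 113.08406, "K": 128.09496, "E": 129.04259,
}
WATER = 18.01056  # mass of H2O added to the sum of residue masses

def peptide_mass(sequence):
    return sum(RESIDUE_MASS[aa] for aa in sequence) + WATER

def matches(sequence, measured_mass, tolerance_da=0.5):
    """True if the theoretical mass is within the instrument tolerance."""
    return abs(peptide_mass(sequence) - measured_mass) <= tolerance_da

candidates = ["PEPLA", "GAVLK", "SSLEK"]  # hypothetical database entries
measured = 486.29  # hypothetical measured (neutral) peptide mass in Da
print([p for p in candidates if matches(p, measured)])  # -> ['GAVLK']
```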


Security and IP-Based 3G Wireless Networks

When : Thursday, May 18, 2006 - 16:30
Speaker : T. La Porta
Affiliation : The Pennsylvania State University, USA
Where : Aula Ke
Description :

Telecommunication networks are evolving from closed systems with limited, standardized services to open systems that will allow great creativity in building and deploying new services. These systems will heavily leverage Internet technology in an effort to create this open environment. This evolution is being aggressively pursued by Wireless Service Providers (WSPs). Along with the benefits of these networks come increasingly high risks of a variety of attacks that may compromise security.
Current, so-called second generation (2G) wireless telecommunication networks are implemented using standardized control protocols for user and device authentication, mobility management, session control and services control. These networks are closed in the sense that control messages are exchanged on a private packet-switched network based on the Signaling System No. 7 standards. Because of their closed nature, there are few successful attacks on these networks. The next, so-called third generation (3G) wireless telecommunication networks are migrating towards IP technology, with the ultimate goal being an all-IP network. Standards for these systems, called the IP Multimedia Subsystem (IMS), are being defined by the Third Generation Partnership Projects (3GPP and 3GPP2). These networks will use IP for transport of information, and Internet protocols such as the Session Initiation Protocol (SIP) and Mobile IP for session control and mobility management. They open the possibility for IP-based services and must interwork with 2G networks.
Because new services will be introduced in the IP domain of these networks, new attacks on 3G networks are possible. Because IP networks are more accessible than SS7 networks, the control portion of the 3G networks is now more vulnerable to attack. These attacks may be remote denial-of-service attacks, or attacks that target the integrity of specific services. The means of attack may vary depending on the interworking model used and the service being offered. In this talk we discuss the different security risks in IP-based 3G networks, different attack types, and the trade-offs of high-performance, open network architectures versus a secure network infrastructure.


High-performance computational grid: a case study for Bioinformatics

When : Thursday, May 11, 2006 - 15:00
Speaker : G. Ciriello
Affiliation : DEI
Where : Aula Magna `A. Lepschy`
Description :

The emerging Grid technology is becoming an important tool for the solution of both compute-intensive and data-intensive problems. A computational grid enables a large number of different machines to act as a single one, by sharing both storage capacity and computing power. The importance of sharing data and resources in a secure manner is demonstrated by the increasing interest of scientists in this technology, especially in the biomedical community, which includes bioinformatics. In proteomics, in particular, a relevant area is the comparison of protein structures. Comparing protein structures is important for protein classification and for understanding protein function. We have developed a method, PROuST, that allows efficient retrieval of similarity information from a database containing all protein structures of the Protein Data Bank (PDB).
In this talk I will first present the general Grid architecture and the PROuST method for protein structure comparison. I will then focus on the optimization strategies adopted to port PROuST onto a real grid, and on its performance there. In more detail, PROuST consists of different components: an index-based search that produces a list of proteins that are good candidates for similarity, and a dynamic programming algorithm that aligns the target protein with each candidate protein. Since both components use the same geometric data stored in large hash tables, an important issue arises when porting the application to a Grid: the trade-off between data transfer and data recomputation. Replica optimization is also a crucial aspect of a gridification strategy. Using the pool of services provided by the European DataGrid, we experimented with two main policies for replica management: On-line Replica and Off-line Replica. In the last part of my talk I will present the results of the efficiency and reliability measurements.
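The index-based first stage can be illustrated with a small sketch (a conceptual toy, not PROuST's actual data structures; the protein identifiers and feature tuples are invented): discretized geometric features act as hash keys, and proteins sharing many features with the query are returned as candidates for the more expensive dynamic-programming alignment.

```python
from collections import defaultdict

def build_index(features_by_protein):
    """Hash table mapping each discretized geometric feature to the proteins containing it."""
    index = defaultdict(set)
    for protein, features in features_by_protein.items():
        for f in features:
            index[f].add(protein)
    return index

def candidates(index, query_features, top_k=2):
    """Vote for every protein sharing a feature with the query; return the top-voted ones."""
    votes = defaultdict(int)
    for f in query_features:
        for protein in index.get(f, ()):
            votes[protein] += 1
    return sorted(votes, key=votes.get, reverse=True)[:top_k]

db = {"1ABC": {(3, 7), (5, 2), (8, 1)},   # hypothetical PDB-like entries with
      "2XYZ": {(3, 7), (4, 4)},           # already-discretized feature tuples
      "3DEF": {(9, 9)}}
index = build_index(db)
print(candidates(index, {(3, 7), (5, 2)}))  # -> ['1ABC', '2XYZ']
```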


New coherent IR sources for ultra-high-sensitivity spectroscopy

When : Tuesday, April 4, 2006 - 12:00
Speaker : Paolo De Natale
Affiliation : Istituto di Ottica Applicata - CNR, Firenze
Where : Aula Magna `A. Lepschy`
Description :

Ultra-high-sensitivity, high-precision spectroscopy has made great progress in recent years, thanks above all to important innovations in coherent sources. Indicative of this progress is, for example, the 2005 Nobel Prize in Physics, whose motivations include the development of the optical frequency comb generator (OFCG) based on ultrashort pulses in the visible, which opens the way to ultra-high-precision spectroscopy in the visible and near IR directly referenced to the primary frequency standard, without the use of chains of phase-locked lasers.
As will be illustrated during the seminar, OFCGs can also be of great use for high-sensitivity detection, thanks to the high measurement stability and reproducibility they allow, especially in combination with radiation sources synthesized with nonlinear techniques, such as difference-frequency generation. Such sources have the further advantage of operating in the infrared region, where molecular absorptions are strongest, thus allowing very high detection sensitivities. Another class of innovative sources, quantum cascade lasers, also operates in the infrared and has allowed trace-gas spectroscopy to make significant progress. Spectroscopic results for both types of sources, obtained with various experimental configurations, will be described.
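Two textbook relations behind the sources mentioned above (standard formulas, not taken from the abstract): the frequency of each comb line is fixed by the repetition rate and the carrier-envelope offset, and difference-frequency generation synthesizes infrared radiation from two higher-frequency fields:

\[
\nu_n = \nu_{\mathrm{ceo}} + n\, f_{\mathrm{rep}}, \qquad \nu_{\mathrm{IR}} = \nu_1 - \nu_2 .
\]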


Approach to Data Mining from an Algorithms Perspective

When : Thursday, March 16, 2006 - 15:00
Speaker : Takeaki Uno
Affiliation : National Institute of Informatics, Tokyo (Japan)
Where : DEI/G meeting room (historic building)
Description :

Recently, data mining has been actively researched and widely applied to real-world problems. There are many kinds of problems in data mining, and frequent itemset mining is a central task for several of them. In this talk, we approach frequent itemset mining from an algorithm-theory perspective, focusing on two topics. The first is modeling the frequent pattern mining problem: we introduce the notion of `closed patterns` and show some of their interesting properties. We then present frequent pattern mining algorithms, identifying their bottlenecks and looking at recent implementations.
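As a concrete reference for the terminology, here is a minimal brute-force sketch of frequent and closed itemsets in Python (a didactic toy with made-up transactions, not the optimized algorithms discussed in the talk):

```python
from itertools import combinations

transactions = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"a", "b", "c"}]
min_support = 2
items = sorted(set().union(*transactions))

def support(itemset):
    """Number of transactions containing the itemset."""
    return sum(1 for t in transactions if itemset <= t)

# Enumerate every itemset and keep those meeting the support threshold.
frequent = [frozenset(c)
            for k in range(1, len(items) + 1)
            for c in combinations(items, k)
            if support(frozenset(c)) >= min_support]

# An itemset is closed if no proper superset has the same support.
closed = [s for s in frequent
          if not any(s < t and support(t) == support(s) for t in frequent)]

print(sorted(map(sorted, closed)))
# -> [['a'], ['a', 'b'], ['a', 'b', 'c'], ['a', 'c']]
```

In this example {b}, {c} and {b, c} are frequent but not closed, because a superset with the same support subsumes them; reporting only closed patterns therefore loses no support information while shrinking the output.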


Low maintenance verification

When : Thursday, March 2, 2006 - 15:00
Speaker : Valeria Bertacco
Affiliation : University of Michigan, USA
Where : Aula Magna `Antonio Lepschy`
Description :

The verification of modern computing systems has grown to dominate the cost of system design, often with limited success, as designs continue to be released with latent bugs. This trend is accelerated by the advent of highly integrated system-on-a-chip (SoC) designs, which feature multiple complex subcomponents connected by simultaneously active interfaces. To cope with this challenge, logic simulation techniques are predominant in the industry; however, the coverage of the tests generated is usually low, with the result that even months of simulation provide little confidence in the correctness of the design, and, when design errors are found, the error analysis phase is daunted by long and complex bug traces. Additionally, these traditional techniques require a lot of effort from the engineering team to direct the verification activity towards specific design areas of critical quality. Nonetheless, they have such high inertia in existing development processes that the cost of transitioning to alternative methodologies, such as formal techniques, is high.
In this talk, I will introduce a new generation of hybrid verification solutions, which we call LOW MAINTENANCE VERIFICATION, where the contribution of formal techniques is transparently deployed within a simulation-based verification framework. Our use of formal techniques in this context greatly enhances the level of automation of the verification process, by generating solutions that can focus on a verification goal with minimal guidance from the engineer. I will overview some of the tools that we developed in this space: Guido and StressTest, two solutions that can automatically generate `interesting` verification scenarios. Our preliminary experience in the domain of low maintenance verification indicates that this family of techniques can effectively lead to high-performance, high-coverage verification solutions, generating concise error traces with minimal demands on verification engineers and no change in the verification process.
Finally, I will briefly provide some highlights from recent architecture and CAD research at the EECS department of the University of Michigan.


Recent progress in the study of conical waves and of the spatiotemporal localization of light in nonlinear systems

When : Tuesday, February 21, 2006 - 15:00
Speaker : Paolo Di Trapani
Affiliation : University of Insubria at Como
Where : Aula Magna `Antonio Lepschy`
Description :

Nonlinear conical waves are peculiar wave packets, featuring a `hot`, localized core that travels locked to a `cold`, extended, energy reservoir. Recent experiments in nonlinear optics have shown them to support stationarity and localization over long-distance propagation in bulk transparent media, even in the presence of strong interaction with matter and nonlinear dissipation. This unique property of conical waves opens new perspectives for those applications in optical technology that would benefit from the availability of genuine `particle-like` waves. Relevant examples are: laser micro-machining, optical-waveguide writing, deep-field nonlinear microscopy, photolithography, plasma-filament generation for lightning control, frequency conversion, etc. As regards fundamental research, nonlinear dissipation has been found to accompany multidimensional wave localization in virtually all physical systems investigated. Thus, the conical-wave concept represents a valid alternative to that of solitons to achieve a unified description of instabilities and collapse in many extended systems, ranging from Optics to Acoustics, Bose-Einstein Condensates and Quantum Physics.