Date | Speaker | Title | Further Information
---|---|---|---
Sep. 5 | Dr Daniela Oelke (U. Konstanz) | Visual Document Analysis | Abstract: Large amounts of data are only available in textual form. Due to the semi-structured nature of text and the impressive flexibility and complexity of natural language, the development of automatic methods for text analysis is a challenging task. This talk will present visual document analysis approaches that integrate the user in the analysis process by means of data visualization techniques. Bio: Daniela Oelke is currently a post-doctoral researcher in the Ubiquitous Knowledge Processing Lab at the DIPF in Frankfurt/Main, Germany. Before joining UKP-DIPF in July 2012, she was a member of the Data Analysis and Visualization Group of the University of Konstanz, Germany, where she also received her PhD degree in computer science in 2010. Her main research focus is on visual analytics techniques for document analysis / linguistic data analysis and visual analytics for educational research.
Sep. 12 | Dr Penny Duquenoy | Integrating ethics and technical development projects | With the increasing pervasiveness of information/communication technologies in all aspects of life has come an increasing awareness of the impact of those technologies on social/ethical values, which may result in (i) loss of public trust in technology, (ii) loss of sales, and (iii) loss of reputation. These factors matter both to business and to research funding agencies. This presentation will give an overview of two funded projects that aimed in different ways to embed ethics into the development process. The first, an EU Science and Society project, “EGAIS (Ethical governance of emerging technologies)”, as its name suggests, investigates and critiques the governance mechanisms currently used by the EU and suggests an alternative governance model to better integrate ethics into EU projects. The second project (the EPSRC project “ISIS: protecting children in online social networks”) included an ethics component from the beginning of the project to investigate ethical concerns relating to the technology to be developed. Both of these projects are about processes that enable the consideration of ethics in technology projects, and both are relevant to student projects, research projects, and IT practitioners.
Sep. 19 | Prof Balbir Barn | Enterprise Architecture Coherence and the Model Driven Enterprise: Is Simulation the answer or are we flying kites? | Aligning information and communications technology (ICT) to business goals is a common issue cited by senior executives, and recent research in measuring alignment provides evidence that those organizations that have successfully aligned their business and IT strategy will outperform those that have not. Enterprise Architecture (EA) aims to capture the essentials of a business, its IT and its evolution, and to support analysis of this information, and is thus seen as an important tool for this alignment requirement. However, existing methods, techniques, languages and supporting technology for EA may not be sufficient for helping deliver this agenda, and increasingly, simulation is perceived as one such solution. Simulation, however, presents other challenges, for example the problem of validity versus credibility. This paper charts a philosophical route through a discussion on models and what they represent, and the communication structures implicitly required for models to work, and proposes that model-based simulation of a model-based enterprise can be more effective if there is a theoretical basis to a simulation model. This hypothesis is evaluated by a re-interpretation of Toulmin’s Argumentation model as a candidate for the underlying theory for constructing simulations of enterprise architecture coherence.
Sep. 26 | Dr Nawaz Khan | Microarray Datasets Using Combined Multiple Models to Predict Leukemia Types | Recent advances in microarray technology offer the ability to measure expression levels of thousands of genes simultaneously. Analysis of such data helps us identify different clinical outcomes that are caused by the expression of a few predictive genes. This research not only aims to select key predictive features for leukaemia expression, but also demonstrates the rules that classify differentially expressed leukaemia genes. The feature extraction and classification are carried out by combining the high accuracy of ensemble-based algorithms with the comprehensibility of a single decision tree. This allows exact rules to be derived that describe gene expression differences among significantly expressed genes in leukaemia. It is evident from this research that it is possible to achieve better accuracy in classifying leukaemia without sacrificing the level of comprehensibility. (An illustrative sketch of the combined-models idea appears below the table.)
Oct. 3 in C204 | Dr Barnaby Martin | Computational complexity and the boundary of tractability | We give a beginner's introduction to the field of Computational Complexity as a prelude to discussing recent obsessions with wide and discrete taxonomies. Specifically, we introduce and discuss the problem classes P, NP and Pspace (also Logspace and Exptime), before exhibiting wide classes of problems that display a trichotomy between P, NP-complete and Pspace-complete. It is still unknown whether there is a natural problem sitting somewhere in between these classes.
Oct. 10 | Prof Orhan Gemikonakli | Current Trends in Telecommunications | After a brief overview of the development of telecommunications from the days of smoke signals to date, current trends such as the convergence of communications technologies, together with their challenges, opportunities, and implications for research, will be explored. Topics such as developments in Mobile Communications, next generation technologies, LTE, handoff strategies, multimedia communication, and cognitive radio will be covered.
Oct. 17 | Doctoral Students | |
Oct. 24 | Prof Mike Paterson (Warwick) | Overhang bounds | How far off the edge of the table can we reach by stacking n identical rectangular blocks? A classic solution achieving logarithmic overhang was widely believed to be optimal. We show, however, that this classic solution is exponentially far from optimal. (A short numerical sketch contrasting the two growth rates appears below the table.)
Oct. 31 | Prof Hemda Garelick | Arsenic pollution in water: analysis and solutions | Access to uncontaminated water in sufficient quantities may be the most important requirement for healthy human societies. However, the relationship between the water supply in developing countries and the health of citizens is complex, since it depends on the provision of both appropriate quantities and quality of water. The drive to provide microbiologically clean water has on occasion resulted in undetected chemical contamination. One classic case is the arsenic contamination of tubewells installed by UNICEF in the 1980s in Bangladesh and West Bengal, where it is estimated that 40 million people were exposed to toxic levels of arsenic. This seminar will provide an overview of the topic and will present studies carried out in the Department of Natural Sciences on some solutions, and related mechanisms, for arsenic remediation.
Nov. 7 | Dr Florian Kammueller | Security of Active Objects | Abstract: Programming in large networks of computers, like the Internet, poses new problems of safely implementing parallel activities, code distribution, and complex communication structures. This talk investigates information flow security for distributed active objects. We consider ASPfun, a calculus for functional active objects that communicate asynchronously. It is completely formalized and its properties proved in the interactive theorem prover Isabelle/HOL. We provide a model for information flow security for active objects and a formal type system for static security analysis. (A toy illustration of static information-flow checking appears below the table.)
Nov. 14 | Prof Richard Bornat | Making concurrency work properly | Modern processors reorder instruction executions and don't maintain a consistent view of memory across a cluster of machines. This makes shared-variable concurrent programming hard, because to force communication to happen in the right order you have to insert barrier instructions. Barriers are very expensive and only a very few people know how to place them and even they make mistakes. I've found a program logic that lets me place them accurately. The talk will be an overview of the problem and a sketch of the logic. It works for Power and ARM processors. There may be machine code.
Nov. 21 | Dr Cong Ling (Imperial) | Structured Compressed Sensing: An Unexpected Journey into the World of Golay Sequences | The benchmark scheme of compressed sensing uses random Gaussian/Bernoulli sensing matrices. Although provably achieving sub-Nyquist sampling rates, such sensing matrices are unfortunately difficult to implement in practice. Recent advances in structured compressed sensing show potential in reduced-complexity implementation. The classical Golay sequences, originally proposed by Golay in 1961, unexpectedly resurface in several constructions of structured sensing matrices. In this talk, I will give an introduction to Golay’s complementary sequences, and show why Golay sequences are useful in compressed sensing. In particular, I will present Golay-OSTM (orthogonal symmetric Toeplitz matrices), Golay-Hadamard matrices, Golay-Fourier matrices, and Golay-MWC (modulated wideband converter), etc., which enjoy theoretical guarantees and fast recovery. (A small sketch of the Golay complementarity property appears below the table.) Bio: Cong Ling received the B.S. and M.S. degrees in electrical engineering from the Nanjing Institute of Communications Engineering, Nanjing, China, in 1995 and 1997, respectively, and the Ph.D. degree in electrical engineering from Nanyang Technological University, Singapore, in 2005. He is currently a Senior Lecturer in the Electrical and Electronic Engineering Department at Imperial College London. His research interests are coding, lattices, network information theory, and signal processing. Before joining Imperial College, he had been on the faculties of the Nanjing Institute of Communications Engineering and King's College. Dr. Ling is an Associate Editor in Multiterminal Communications and Lattice Coding for the IEEE Transactions on Communications. He has also served as an Associate Editor for the IEEE Transactions on Vehicular Technology and on the program committees of several international conferences, including the IEEE Information Theory Workshop, Globecom, and ICC.
Nov. 28 | Prof Edmund Penning-Rowsell | Distributional Aspects and the Developing Practice of Assessing the Real Impact of Flooding in the UK Today | Flooding is in the news today, and in the Flood Hazard Research Centre, Middlesex University has one of the premier research capabilities in the UK. The Centre was founded in 1970 and has pioneered work on the social and economic aspects of natural hazards, focusing on flooding, and has produced over the last 42 years many technical manuals and scores of research papers in this field. In the year 2000 the Centre was awarded the Queen's Anniversary Prize for Further and Higher Education, and its then head was awarded the OBE in 2006. The research currently being pursued is on social equity and flood hazard management, efficiency of budgetary processes, risk communication through maps and social media, and governance aspects of flood hazard management. Prof Edmund Penning-Rowsell will give a seminar on elements of the Centre's work over the last decade, focusing on distributional aspects and the developing practice of assessing the real impact of flooding in the UK today.
Dec. 5 | Gary Hearne and Dr Kristian Sund | A Structural Model of the Interpretation Process and Perceived Uncertainty in a Specific Issue | Sensemaking in general, and the interpretation of environmental cues specifically, are carried out under conditions of uncertainty with which managers and organizations must cope. Mounting evidence supports the existence of organizational processes of issue interpretation, but empirical evidence is often conflicting in its conclusions about specific relationships within these processes and about the effects of perceived uncertainty on them. This paper reports the results of a survey of how hotel executives interpret a specific environmental trend. We use a combination of traditional factor analysis and structural equation modelling to test hypotheses on the relationships between perceived environmental uncertainty and the interpretation process.
Dec. 12 | Carl James-Reynolds | A Christmas Stocking | A miscellany of items and toys are pulled from the stocking, to include the odd student project, observations on children and social networks, a modular synth, some data about student phone use, interactive musical rooms, a breadboard to beat your head against, a roll of gaffer tape, possibly a cracker and an Orange or Apple. Like all stockings, some of the items will be discarded and lost under the sofa, some will be used up and some may even last until next year. Party hats are optional.
Dec. 19 | Christmas | |
Dec. 26 | Christmas | |
Jan. 2 | Christmas | |
Jan. 9 | Dr Elke Duncker-Gassen | The implementation of the national researcher development framework in the School of S&T | Explanation: The researcher development framework (RDF) is a national framework which all universities and schools have to implement. The QAA have adopted it and will assess the academic provisions for researchers and research students on that basis. Associate Dean Prof. Balbir Barn, Dr Geetha Abeysinghe, Dr Rui Loureiro, Dr Florian Kammueller and Dr Elke Duncker will present the framework itself, its implementation in our school and the subsequent list of seminars and other classes for research students and researchers.
Jan. 16 | Prof Andreas Albrecht | Biomolecular Structure Prediction and MicroRNA Bindings | Abstract: MicroRNAs are small non-coding RNAs that play an important role in the regulation of protein expression in animals and plants. The length of microRNAs is about 21nt, and through binding predominantly to the 3'UTR of messenger RNAs (mRNAs, translated into proteins) they inhibit translation or induce cleavage of mRNAs. Although the first microRNA, lin-4, was discovered as early as 1993, the existence of a whole new class of non-coding RNAs (not translated into proteins) became apparent only several years later, and the term microRNA eventually emerged in 2001. Since the experimental verification of true microRNA-mRNA bindings is time-consuming and expensive, computational tools for the prediction of target mRNAs of a given microRNA are essential in Molecular Biology. The presentation will review the most prominent microRNA target prediction tools that have been developed over the past decade and highlight their main features. Messenger RNAs are present in the cell in a folded state. Most advanced microRNA target prediction tools incorporate information about the folded state in the form of secondary structures (a relaxation of 3D structures; 3D structure prediction for RNAs is NP-complete). In the context of microRNA target prediction, results from our research on approximation algorithms for RNA energy landscapes induced by secondary structures will be presented. (An illustrative secondary-structure calculation appears below the table.)
Jan. 23 | Prof William Wong | Applying Cognitive Systems Engineering to intelligence analysis | ABSTRACT: In this talk, I will discuss how principles from Cognitive Systems Engineering might be used to design Visual Analytics systems to support intelligence analysts. In designing systems to control processes such as nuclear power generation, CSE has been used to determine and model a priori the functional relationships that relate the performance of the processes with system outcomes. Visual forms are then created to represent these invariant relationships in ecological interface designs. Can cognitive systems engineering be applied to the domain of intelligence analysis? And if yes, how might this be? And how should CSE principles be applied to the design of visual representations in intelligence analysis to take advantage of the benefits we have seen when CSE is applied to causal systems? ABOUT THE SPEAKER: Dr William Wong is Professor of Human-Computer Interaction and Head, Interaction Design Centre, at Middlesex University's School of Engineering and Information Sciences, London, UK. His research interest is in the representation design of information to support decision making in naturalistic environments. Recipient of over US$7.1 million in grants, and project coordinator for several projects, he is currently investigating the problems of visual analytics in sense-making domains with high information density and variability, in contexts such as intelligence analysis, financial systemic risk analysis, and low literacy users.
Jan. 30 | Dr Jane Barnett | Startle reaction: Capturing experiential cues to provide guidelines towards the design of realistic training scenarios | Unexpected events experienced during a crisis situation may startle an individual, causing them to momentarily freeze and become unsure of what to do next. This research identifies event cues and subsequent responses to them. Ten emergency personnel were interviewed about their experiences with unexpected events where they had experienced a startle reaction. We used a new methodology (CUUES) to investigate the cognitive and behavioural processes used to adapt and respond to them. Results suggest that, regardless of experiencing startle, participants were able to gain control of the situation. Our results suggest initial guidelines for the development of realistic training scenarios. Evaluating the effect of startling and surprising events in immersive training systems for emergency response: In emergency situations, emergency service personnel are often confronted with unexpected events that are difficult to manage. The aim of this study is to identify and understand the impact of complex and startling cues generated by these events, to contribute to the development of realistic virtual-world training simulations. Two factors were explored for this purpose: the complexity of the action relevance check and the intensity of the unexpected event, which were varied across four experimental conditions of a simulated emergency reaction time task. Results showed that startling participants did not interrupt their on-going task, but that increasing the complexity of the task did. From this, we propose that unexpected events in training simulations should additionally expose trainees to complex and realistic situations, rather than simply startling them with sudden audio/visual stimuli.
Feb. 6 | Prof Iain Stewart (Durham) | Some recent developments in interconnection networks | Interconnection networks play a fundamental role in Computer Science. They are the means by which the processors of a distributed-memory multiprocessor computer communicate and also by which I/O devices communicate with processors and memory; they are increasingly used in network switches and routers to replace buses and crossbars; and they are crucial to on-chip networks. Their usage is becoming more prevalent and on an increasingly grand scale, as we manage to incorporate more and more 'nodes' within the confines of a single space. For example, the newest generation of high-end supercomputer networks is beginning to use high(er)-dimensional tori: the IBM Blue Gene/Q is based around a 5-dimensional torus, and the K computer is based around a 6-dimensional torus. The extended applications of interconnection networks, allied with technological advances, promote (the currently theoretical) investigation of new interconnection networks with a view to ensuring properties beneficial to the domain of application. In this talk, we'll look at some basic topological issues relating to (some popular) interconnection networks before moving on to look at relatively more recent hierarchical interconnection networks, some of which have been inspired by the use of free-space optical interconnect technologies in combination with electronic technologies. The talk will be suitable for a general audience and theoretical in nature. (A small sketch of torus topology calculations appears below the table.)
Feb. 13 | Dr Andrew Tizzard | Bioimpedance Imaging: The Shape of Things to Come? | Electrical Impedance Tomography (EIT) is an imaging technique based on multiple bioimpedance measurements to produce a map (image) of impedance, or changes in impedance, across a region. The mathematical principle behind producing the images is based on solving an inverse problem, and this requires a forward model to evaluate the reconstruction. The most commonly used approach to generating a forward model is to produce a finite element mesh of the anatomical target structure. Historically, these models have been approximations of the anatomy based on spheres (head), cylinders (thorax, neck etc.) and mixed geometric primitives for more complex structures such as the breast. This seminar presents the work of the research group at Middlesex over the last 12 years that has focused on creating more geometrically accurate forward models for electrical impedance tomography, and how these have improved image quality and localization of impedance changes. The way forward for rapid generation of patient-specific geometry is also discussed, and what implications this may have for potential clinical applications. (A minimal sketch of a linearised reconstruction step appears below the table.)
Feb. 20 | Doctoral Students | |
Feb. 27 | Prof Stephen Dilworth | Viruses and Cancer: Good Guys or Villains? | Early investigations into the molecular changes that cause cells to become cancerous relied heavily on viral models. Two groups of viruses, the retroviruses and the small DNA tumour viruses, provided these models as they could alter cells in culture, cause cancers in animals, and, importantly, could be manipulated for genetic analysis. As a consequence, most of our understanding of how cancers form came from these studies, together with the techniques required to study these changes. Without these viruses, it would have taken us much longer to develop the rational therapy approaches that are currently revolutionising cancer treatments. The viruses studied for early molecular cancer research were animal derived, however, and do not normally initiate cancers in vivo. For many years it was believed that human viruses did not have a role in cancer formation, but over the last 25 years or so it has become clear that this is not the case. Up to 95% of human cervical carcinomas have HPV as a major contributory factor, and all young females in the UK are now being offered a vaccination to protect against this. Other viruses, such as many of the Herpes and polyoma families, are now known to be involved in causing cancers, and most virologists accept that this is only the start. This has its good and bad sides: we can vaccinate against them, so preventing cancers, but we can also catch them, making cancer, at least partly, an infectious disease! The situation is now beginning to turn full circle, however. As a consequence of our close study of the cancer-causing viruses, we now realise they can also be used to cure cancers. There is intensive research being performed at the moment into ways of creating viruses that specifically infect and kill cancer cells, so-called oncolytic viruses. In addition, viruses are being used as gene therapy agents to carry corrective genes, such as tumour suppressors, into cancer cells to correct their defects. In the last 35 years, then, viruses have been seen as good guys in our search for cures for cancer, then were seen as possible causes of the cancers, and now may provide therapy agents. Some of the history of these viral studies will be presented, together with our current work using small DNA viruses to identify novel therapy targets that may provide new anti-cancer drugs.
March 6 | Dr Aboubaker Lasebae | Mobile Femtocells (relays) in LTE Network | In order to meet the need for high-speed wireless connections, Long Term Evolution (LTE) has been standardised and deployed by the 3rd Generation Partnership Project (3GPP). Like previous cellular networks, the capacity of the LTE network is not evenly distributed; in particular, cell-edge users have much worse throughput than cell-centre users. LTE-Advanced addresses this by improving system capacity and the cell-edge user experience, especially by introducing femtocell technology. Femtocells are low power eHome NodeBs (eHNB) that provide enhanced coverage and capacity at cell edges. One of the main benefits of relaying is to provide extended LTE coverage in targeted areas at low cost. The femtocells are connected to the base station via a radio interface, and their radio resources are shared among the cell-edge mobile users. In this seminar, we explore the possibility of mobile deployment of such femtocell nodes, known as "moving" relay nodes (MRN), on the current cellular network. An MRN is simply a relay node, but it is a moving node instead of one at a fixed location. An MRN is envisaged as a relay node carried by a moving object, ideally a vehicle. Deployment of an MRN may potentially lower the end-to-end outage probability at the user node compared to serving the user node by direct or fixed RN (FRN) assisted transmission.
March 13 | Dr Glenford Mapp | Exploring the Effect of Maximum Velocity Transportation Models on Handovers in Heterogeneous Environments using Exit Times | Mobile nodes now have several interfaces, and connections can be seamlessly switched among available networks using vertical handover techniques. The amount of time that a mobile node spends in a wireless network is called the Network Dwell Time or NDT. The Exit Time or ET is the amount of time that a mobile node has in a network before it must begin the handover to another network. ET is dependent on the velocity of the mobile node and the time taken to handover. Thus if the Exit Time is less than or equal to zero, handover cannot occur because the mobile node does not have enough time to acquire the network resources to do the handover to the next network. In addition, most transportation systems use the concept of maximum velocity or speed limit restrictions to restrict the velocity of vehicles. Using this approach, this talk explores how the velocity of users can affect handover decisions. Handovers between 3G and WLAN networks are examined. (An illustrative exit-time calculation appears below the table.)
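
Illustrative sketch for the Sep. 26 talk (combined multiple models): the abstract describes combining the accuracy of an ensemble with the comprehensibility of a single decision tree. The snippet below is a minimal sketch of one common way to do this — train an ensemble, relabel the data with the ensemble's predictions, and fit one shallow tree to those labels so its rules can be read off. The synthetic dataset, feature names and parameters are placeholders, not the study's actual microarray pipeline.

```python
# Minimal sketch: approximate an ensemble with a single, readable decision tree.
# Synthetic data stands in for gene-expression features; not the talk's dataset.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = make_classification(n_samples=300, n_features=50, n_informative=5,
                           random_state=0)

# 1. Accurate but opaque ensemble.
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

# 2. Relabel the data with the ensemble's predictions, then fit one shallow tree
#    to mimic the ensemble while staying human-readable.
tree = DecisionTreeClassifier(max_depth=3, random_state=0)
tree.fit(X, forest.predict(X))

# 3. The tree's splits can be read off as explicit classification rules.
print(export_text(tree, feature_names=[f"gene_{i}" for i in range(X.shape[1])]))
```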
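Numerical sketch for the Oct. 24 talk (overhang bounds): with n identical unit-length blocks, the classic one-block-per-level stack reaches an overhang of half the n-th harmonic number, which grows only logarithmically, whereas constructions in this line of work achieve overhang growing roughly like n^(1/3). The snippet simply tabulates the two growth rates; the constant used for the n^(1/3) curve is an illustrative placeholder, not the paper's exact bound.

```python
# Compare the classic logarithmic overhang with an order-n^(1/3) growth curve.
# Blocks have unit length; overhang is measured in block lengths.

def harmonic_overhang(n: int) -> float:
    """Overhang of the classic stack: half the n-th harmonic number."""
    return 0.5 * sum(1.0 / k for k in range(1, n + 1))

def cube_root_growth(n: int, c: float = 0.5) -> float:
    """Illustrative order-n^(1/3) curve (constant c is a placeholder)."""
    return c * n ** (1.0 / 3.0)

for n in (10, 100, 1000, 10**4, 10**5, 10**6):
    print(f"n={n:>8}  classic={harmonic_overhang(n):7.2f}  "
          f"~n^(1/3) curve={cube_root_growth(n):7.2f}")
```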
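Toy sketch for the Nov. 7 talk (information flow security): ASPfun and its Isabelle/HOL type system are far beyond a few lines of Python, so this only illustrates the underlying idea of a static information-flow check — data labelled High must never flow to a Low sink. The labels, functions and program representation are invented for illustration and are not the talk's formal system.

```python
# Toy static information-flow check over a tiny assignment language.
# Labels form the two-point lattice Low <= High; an assignment is allowed
# only if the label of the source is <= the label of the destination.

LEVELS = {"Low": 0, "High": 1}

def flows_to(src_label: str, dst_label: str) -> bool:
    """Low may flow anywhere; High may only flow to High."""
    return LEVELS[src_label] <= LEVELS[dst_label]

def check(program, labels):
    """program: list of (dst_var, src_var) assignments; labels: var -> label."""
    errors = []
    for dst, src in program:
        if not flows_to(labels[src], labels[dst]):
            errors.append(f"illegal flow: {src} ({labels[src]}) -> {dst} ({labels[dst]})")
    return errors

labels = {"secret": "High", "public": "Low", "log": "Low", "vault": "High"}
program = [("vault", "secret"),   # High -> High: fine
           ("log", "public"),     # Low  -> Low:  fine
           ("log", "secret")]     # High -> Low:  rejected
print(check(program, labels))
```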
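Sketch for the Nov. 21 talk (Golay sequences): a Golay complementary pair is a pair of ±1 sequences whose aperiodic autocorrelations cancel at every non-zero shift — the property that makes them attractive for building structured sensing matrices. The snippet uses the standard doubling construction to generate a pair and verifies complementarity; it does not reproduce any of the Golay-OSTM/Hadamard/Fourier/MWC constructions from the talk.

```python
# Build a Golay complementary pair by the classic doubling construction and
# verify that their aperiodic autocorrelations sum to zero at non-zero lags.

def golay_pair(m: int):
    """Return a complementary pair (a, b) of length 2**m with entries +/-1."""
    a, b = [1], [1]
    for _ in range(m):
        a, b = a + b, a + [-x for x in b]
    return a, b

def acf(x, lag):
    """Aperiodic autocorrelation of x at the given lag."""
    return sum(x[i] * x[i + lag] for i in range(len(x) - lag))

a, b = golay_pair(5)            # length-32 pair
n = len(a)
for lag in range(n):
    total = acf(a, lag) + acf(b, lag)
    assert total == (2 * n if lag == 0 else 0)
print(f"verified complementarity for a length-{n} Golay pair")
```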
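Sketch for the Jan. 16 talk (RNA secondary structures): the abstract refers to secondary structures as a simplification of full 3D folding. Purely as an illustration of secondary-structure computation, the snippet implements the classic Nussinov base-pair-maximisation dynamic program; this is a textbook algorithm, not the energy-landscape approximation methods discussed in the talk, and the example sequence is made up.

```python
# Nussinov dynamic program: maximise the number of complementary base pairs
# in an RNA secondary structure (no pseudoknots, minimum loop length 3).

PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}
MIN_LOOP = 3

def max_pairs(seq: str) -> int:
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(MIN_LOOP + 1, n):           # span = j - i
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                   # base i left unpaired
            if (seq[i], seq[j]) in PAIRS:         # base i pairs with base j
                best = max(best, dp[i + 1][j - 1] + 1)
            for k in range(i + 1, j):             # bifurcation into two halves
                best = max(best, dp[i][k] + dp[k + 1][j])
            dp[i][j] = best
    return dp[0][n - 1]

print(max_pairs("GGGAAAUCC"))   # toy sequence, prints the maximal pair count
```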
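Sketch for the Feb. 6 talk (interconnection networks): the abstract mentions the 5-dimensional torus of the IBM Blue Gene/Q and the 6-dimensional torus of the K computer. The snippet gives a minimal model of a k-ary n-dimensional torus — listing a node's neighbours and computing the network diameter — with made-up radix values, purely to make the topology concrete.

```python
# Minimal model of a k-ary n-dimensional torus interconnection network.
# Each node is an n-tuple of coordinates; links wrap around in every dimension.
from itertools import product

def neighbours(node, radices):
    """Nodes one hop away: +/-1 (mod radix) in each dimension."""
    result = set()
    for d, k in enumerate(radices):
        for step in (-1, 1):
            nbr = list(node)
            nbr[d] = (nbr[d] + step) % k
            result.add(tuple(nbr))
    return sorted(result)

def diameter(radices):
    """Torus diameter: sum over dimensions of floor(k/2) hops."""
    return sum(k // 2 for k in radices)

radices = (4, 4, 4, 4, 2)            # illustrative 5-D torus shape
print(len(list(product(*(range(k) for k in radices)))), "nodes")
print("neighbours of the origin:", neighbours((0, 0, 0, 0, 0), radices))
print("diameter:", diameter(radices), "hops")
```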
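Sketch for the Feb. 13 talk (EIT image reconstruction): the abstract explains that images are obtained by solving an inverse problem against a forward model. One widely used linearised approach is Tikhonov-regularised least squares on a Jacobian supplied by the forward model; the snippet shows only that algebraic step with a random stand-in Jacobian. Building the finite-element forward model itself (the subject of the talk) is out of scope here, and the problem sizes are invented.

```python
# Linearised one-step EIT reconstruction: Tikhonov-regularised least squares.
# J would come from a finite-element forward model; here it is a random stand-in.
import numpy as np

rng = np.random.default_rng(0)
n_measurements, n_pixels = 208, 576                    # illustrative problem size
J = rng.standard_normal((n_measurements, n_pixels))    # stand-in Jacobian
true_change = np.zeros(n_pixels)
true_change[100:110] = 1.0                             # synthetic conductivity change
dv = J @ true_change + 0.01 * rng.standard_normal(n_measurements)  # noisy data

lam = 1e-2                                             # regularisation parameter
# Solve (J^T J + lam I) x = J^T dv for the conductivity-change image x.
x = np.linalg.solve(J.T @ J + lam * np.eye(n_pixels), J.T @ dv)
print("reconstructed change in the perturbed region:", x[100:110].mean().round(3))
```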
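Sketch for the March 13 talk (exit times): the abstract states that the Exit Time depends on the mobile node's velocity and the handover time, and that handover is infeasible once it reaches zero. The snippet encodes one simple reading of that relationship — remaining dwell time in the coverage area minus the handover latency — as an assumption for illustration; the talk's actual models (including maximum-velocity transportation models) are more elaborate, and the distances and latencies below are made up.

```python
# Illustrative exit-time check for a vertical handover decision.
# Assumed relationship (for illustration only):
#   network dwell time  NDT = remaining distance in coverage / velocity
#   exit time           ET  = NDT - handover time
# Handover is only feasible while ET > 0.

def exit_time(remaining_distance_m: float, velocity_mps: float,
              handover_time_s: float) -> float:
    ndt = remaining_distance_m / velocity_mps
    return ndt - handover_time_s

for v_kmh in (5, 30, 60, 120):               # pedestrian to motorway speeds
    v = v_kmh / 3.6                           # km/h -> m/s
    et = exit_time(remaining_distance_m=100.0, velocity_mps=v, handover_time_s=4.0)
    feasible = "handover feasible" if et > 0 else "handover NOT feasible"
    print(f"{v_kmh:>4} km/h: ET = {et:6.1f} s  -> {feasible}")
```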