Past CNS Talks
Spatial Narratives and Deep Maps: Explorations in the Spatial Humanities
Abstract: The spatial turn in the humanities has been heavily premised on the use of GIS and geospatial technologies in project-based applications. GIS came to the humanities only after making much earlier and more successful inroads into the sciences and social sciences, not least because its algorithmic and positivist scientific architecture would initially appear to be at odds with the predominantly text-based and qualitative world of the humanities. Yet the humanities, far from being the recipients of a colonizing technology, have the potential to assimilate, shape, and refashion the technology to suit the distinctive characteristics of their own methodological traditions. This presentation explores the assumptions inherent in the adoption of a spatial scientific methodology and proposes ways in which the broader science of geographic information may be appropriately harnessed in the spatial humanities.
Specifically, the presentation will explore the use of spatial narratives and deep maps to accommodate the demands of humanists while taking full advantage of the power of GIS and Web 2.0 technologies. To illustrate the potential of this approach, it will focus on the NEH-sponsored Digital Atlas of American Religion (DAAR), a project of the US-based Virtual Center for Spatial Humanities, a collaboration among West Virginia University, Florida State University, and Indiana University Purdue University, Indianapolis. By using multi-level modeling, complex visualizations, and exploratory spatial data analysis (ESDA), the DAAR will illustrate the potential of constructing deep maps to engage the narrative traditions favored by humanists.
Bio: David J. Bodenhamer (PhD, Indiana University, 1977). Executive Director, the Polis Center, Professor of History, and Adjunct Professor of Informatics, Indiana University Purdue University, Indianapolis. Bodenhamer established The Polis Center in 1989 as a multidisciplinary unit dedicated to using collaboration, applied research, and knowledge of advanced spatial technologies to provide reliable information, thoughtful perspective, and creative solutions for the improvement of communities in Indiana and beyond. Since its establishment, the center has completed over 500 projects and has received over $65 million in grant and contract awards. With Trevor Harris and John Corrigan, Bodenhamer created the Virtual Center for Spatial Humanities, a collaboration among West Virginia University, Florida State University and IUPUI to promote the use of spatial theory and spatial technologies in the humanities. He also has developed international partnerships in Europe and Asia to advance this rapidly growing field. In addition to his publications in spatial humanities, Bodenhamer is editor of the International Journal of Humanities and Arts Computing, the editor-in-chief of the Encyclopedia of Indianapolis and author or editor of seven books in U.S. legal and constitutional history, including The Revolutionary Constitution, scheduled for publication by Oxford University Press in fall 2011.
Abstract: We discuss general theory behind deterministic annealing and illustrate with applications to mixture models (including GTM and PLSI), clustering and dimension reduction. We cover cases where the analyzed space has a metric and cases where it does not. See also
—Rose, K. (1998). "Deterministic Annealing for Clustering, Compression, Classification, Regression, and Related Optimization Problems." Proceedings of the IEEE, 86: 2210-2239.
—Hofmann, T., & Buhmann, J. M. (1997). "Pairwise Data Clustering by Deterministic Annealing." IEEE Transactions on Pattern Analysis and Machine Intelligence, 19: 1-13.
—Klock, H., & Buhmann, J. M. (2000). "Data Visualization by Multidimensional Scaling: A Deterministic Annealing Approach." Pattern Recognition, 33(4): 651-669.
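As a rough illustration of the deterministic annealing idea covered in the talk, the sketch below anneals a soft k-means clustering: point-to-cluster assignments are Gibbs distributions at temperature T, and lowering T gradually hardens them toward a deterministic clustering. This is a minimal illustrative implementation under arbitrary parameter choices, not code from the presentation or the cited papers.

```python
import numpy as np

def deterministic_annealing_clustering(X, k=3, T0=5.0, Tmin=1e-3, cooling=0.9, seed=0):
    """Soft clustering by deterministic annealing: at each temperature T,
    minimize free energy F = <distortion> - T * entropy via EM-style updates,
    then cool. High T gives near-uniform assignments; low T hardens them."""
    rng = np.random.default_rng(seed)
    # Initialize centers at randomly chosen data points, slightly perturbed.
    centers = X[rng.choice(len(X), k, replace=False)] \
        + 1e-6 * rng.standard_normal((k, X.shape[1]))
    T = T0
    while T > Tmin:
        for _ in range(20):  # fixed-point iterations at this temperature
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)  # squared distances
            logits = -d2 / T
            logits -= logits.max(axis=1, keepdims=True)                # numerical stability
            p = np.exp(logits)
            p /= p.sum(axis=1, keepdims=True)                          # Gibbs assignment probabilities
            centers = (p.T @ X) / p.sum(axis=0)[:, None]               # re-estimate centroids
        T *= cooling                                                   # cooling schedule
    return centers, p.argmax(axis=1)
```

As T passes below a critical value (related to the data covariance), the merged-center solution becomes unstable and clusters split, which is the phase-transition behavior that gives deterministic annealing its robustness to poor initialization.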
Bio: Geoffrey Charles Fox received a Ph.D. in Theoretical Physics from Cambridge University and is now Distinguished Professor of Informatics and Computing, and Physics, at Indiana University, where he is director of the Digital Science Center and Associate Dean for Research and Graduate Studies at the School of Informatics and Computing. He previously held positions at Caltech, Syracuse University, and Florida State University. He has supervised 62 PhD students and published over 600 papers in physics and computer science. He currently works on applying computer science to bioinformatics, defense, earthquake and ice-sheet science, particle physics, and chemical informatics. He is principal investigator of FutureGrid, a new facility to enable the development of new approaches to computing. He is involved in several projects to enhance the capabilities of Minority Serving Institutions.
Activating family networks: Using family health history information to promote health in Mexican origin families
Abstract: The current project aimed to identify intervention components that activate family network systems to exchange social resources among network members and to examine how these new resource exchanges influence health behavior. Specifically, Project Risk Assessment for Mexican Americans (RAMA) investigated the impact of family health history (FHH)-based risk feedback on the risk communication and screening encouragement pathways in families of Mexican origin. All 465 participants from 161 households received an FHH pedigree. Households were randomized to one of four feedback conditions defined by two factors: 1) whether all or only one participating household member received supplemental, personalized FHH-based risk assessments (RAs) and 2) whether or not behavioral recommendations accompanied these personalized RAs. Personalized RAs and behavioral recommendations for heart disease and diabetes were generated using the CDC's Family Healthware. Outcomes included enumerated family members with whom participants shared feedback and discussed family risk of heart disease and diabetes at 3-month follow-up, and from whom they received encouragement to engage in risk-reducing behaviors at 10-month follow-up. Participants from households in which all members received supplemental RAs were more likely to initiate new communication pathways regarding family risk of heart disease, but not diabetes, at 3-month follow-up. At 10-month follow-up, participants from households in which everyone received an RA and behavioral recommendations were more likely to enumerate new encouragers of blood pressure and blood glucose testing. With respect to encouragement of lifestyle factors, participants in households in which all members received supplemental RAs were more likely to enumerate new encouragers of increased fruit and vegetable consumption at 10-month follow-up, while provision of behavioral recommendations improved encouragement for maintaining a healthy weight.
Results suggest that a family-centered FHH-based feedback approach was more effective than an individual level approach in activating risk communication and behavioral encouragement pathways within family network systems. Next steps will examine how these social processes influence health behavior.
Bio: Dr. Koehly's research focuses on developing and applying social network methods to the study of complex social systems, such as families and communities. Her current research examines the influence of social context on coping responses to communication of hereditary risk and evaluates the effects of social context on improving health outcomes. To that end, she seeks to develop effective family-based interventions to encourage communication among family network members about genetic risk information, as well as to mobilize related social support processes that increase appropriate screening regimens and health-promoting behaviors. See also http://www.genome.gov/14514804.
Stability and Conformity in Scientists’ Research Strategies (with Jacob G. Foster and Andrey Rzhetsky)
Abstract: Scientific advance is profoundly affected by scientists' choice of research problems. We use a complex networks approach to consider this in the context of biomedical chemistry: To what degree do scientists introduce novel compounds and novel chemical relationships or repeat those defined previously? To what degree does their work consolidate existing subfields or bridge distant ones? Our findings show that even as the network of chemical knowledge grows dramatically, the distribution of these strategies remains remarkably stable. We demonstrate that high-risk strategies involving the exploration of new chemical relationships are less prevalent in the scientific literature, produce more unexpected findings, and carry a greater risk of being ignored, but also a greater likelihood of achieving scientific appreciation and importance. I also present findings from research measuring novelty in the discovery process and its implications for assessing and encouraging scientific and technological innovation.
Bio: My current work explores how social and technical institutions shape knowledge—science, scholarship, law, news, religion—and how these understandings reshape the social and technical world. I am particularly interested in the relation of markets to science and knowledge more broadly. I have studied how industry collaboration shapes the ethos, secrecy, and organization of academic science; the web of individuals and institutions that produce innovations; and markets for ideas and their creators. I have also examined the impact of the Internet on knowledge in society. My work uses natural language processing, the analysis of social and semantic networks, statistical modeling, and field-based observation and interviews.
CNS and IVL Open House
Abstract: Open your laptops and demo your software. Bring posters to introduce your research questions and results. Feel free to visit the IVL/CNS Open House web site. There will be presentations of research and demos of diverse tools between 4:15 and 5:45 p.m.
—Scott Weingart - Finding Scholarly Communities Then and Now
—Angela Zoss - Comprehension of Informetric Visualizations
—Peter Hook - Network Analysis of the Relationship Between Law School Courses: Data from an 18 Person Card Sort Exercise
—Stacy Kowalczyk - e-Science Data Environments: A View from the Lab Floor
—Michael D. Conover - Social Media & the Networked Public Sphere
—Dimitar Nikolov - Social Spam
—Yu Li - Research and Development in National Science Library
—Robert Light - Scholarly Database
—Chin Hua Kong - VIVO and Online Interactive Maps
—Joseph Biberstine - Science of Science (Sci2) Tool
Tools and Services
—Online Interactive Maps, MapSustain, NRN
—Scholarly Database of 25 million scholarly records, http://sdb.cns.iu.edu
—VIVO National Researcher Network, http://vivo.cns.iu.edu
—Network Workbench Tool and Community Wiki, http://nwb.cns.iu.edu
—Science of Science Tool and Portal, http://sci2.cns.iu.edu
—Epidemics Tool and Marketplace, http://epic.cns.iu.edu
Exploring the full eigenvalue spectrum of complex networks
Abstract: We present insights from analyzing the eigenvalues of the adjacency, normalized Laplacian, unnormalized Laplacian, and modularity matrices of a range of real-world graphs and network models. This includes finding complete spectra for graphs with hundreds of thousands of nodes. In particular, we explore the origins of a few distinctive features of the spectrum, including the presence of a large null space of the adjacency matrix as well as a characteristic dip in the spectrum of the normalized Laplacian around the eigenvalue 1.
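For small graphs, the matrices named in the abstract can be formed explicitly and their full spectra computed with a dense symmetric eigensolver; the sketch below is a hypothetical illustration of that setup, not the speaker's code (graphs with hundreds of thousands of nodes require specialized methods, since dense eigensolvers scale only to a few thousand nodes). A star graph already exhibits one of the structural features mentioned: a large adjacency null space, and a normalized-Laplacian eigenvalue at exactly 1 with high multiplicity.

```python
import numpy as np

def graph_spectra(A):
    """Eigenvalues of the adjacency, unnormalized Laplacian, and normalized
    Laplacian of an undirected graph given as a dense 0/1 adjacency matrix."""
    A = np.asarray(A, dtype=float)
    deg = A.sum(axis=1)
    L = np.diag(deg) - A                                  # unnormalized Laplacian D - A
    dinv = np.where(deg > 0, deg ** -0.5, 0.0)            # D^{-1/2}, guarding isolated nodes
    Ln = np.eye(len(deg)) - dinv[:, None] * A * dinv[None, :]  # I - D^{-1/2} A D^{-1/2}
    return np.linalg.eigvalsh(A), np.linalg.eigvalsh(L), np.linalg.eigvalsh(Ln)

# Star graph K_{1,5}: hub node 0 connected to five leaves.
A = np.zeros((6, 6))
A[0, 1:] = A[1:, 0] = 1
adj, lap, nlap = graph_spectra(A)
```

Here the adjacency spectrum is {-sqrt(5), 0 (x4), sqrt(5)} (a four-dimensional null space from the duplicate leaf neighborhoods), and the normalized Laplacian spectrum is {0, 1 (x4), 2}.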
Bio: David Gleich is interested in treating network problems with matrix computations and using matrix computation for analyzing simulation data. He has served as a program committee member for ACM Hypertext and KDD conferences, SIAM's Data Mining conference, and reviewed articles for Applied Mathematics Letters, Physical Review Letters, and SIAM's Journal of Matrix Analysis and SIAM's Journal of Scientific Computing.
Studying the world and human activity by mining photo-sharing websites
Abstract: The popularity of photo-sharing websites has created immense collections of images online, with Flickr and Facebook alone hosting over 50 billion images. Each of these photos is an observation of what a small part of the world looked like at a particular point in time and space, as well as a record of where its photographer was and what he or she was paying attention to. When aggregated together and combined with the non-visual metadata available on photo sharing sites (including timestamps, geo-tags, and captions), these billions of photos are a rich source of information about the world and about human activity. In this talk I'll discuss some of our recent work in data mining and computer vision that aims to unlock this latent information from photo-sharing sites. In particular, I'll focus on two recent lines of work: reconstructing maps and 3-d models of the world from online photos, and studying how patterns of human travel are correlated with (and predictive of) social connections.
Bio: David Crandall is an Assistant Professor in the School of Informatics and Computing at Indiana University, where he is a member of the programs in Informatics, Computer Science, and Cognitive Science, and of the Center for Complex Networks and Systems Research. He received the Ph.D. in computer science from Cornell University in 2008 and the M.S. and B.S. degrees in computer science and engineering from the Pennsylvania State University in 2001. He was a Postdoctoral Research Associate at Cornell from 2008-2010, and a Senior Research Scientist with Eastman Kodak Company from 2001-2003.
Social Media and the Networked Public Sphere
Abstract: Social media platforms play an important role in shaping political discourse in America and around the world. In this talk we will explore a series of analyses examining the structure and content of political communication on Twitter surrounding the 2010 U.S. congressional elections. Using a combination of quantitative and qualitative methods, we demonstrate that the network of political retweets exhibits a highly segregated partisan structure, with extremely limited connectivity between left- and right-leaning users. Surprisingly, this is not the case for the user-to-user mention network, which is dominated by a single politically heterogeneous cluster of users in which ideologically opposed individuals interact at a much higher rate than in the retweet network. Building on this foundation, we develop a set of machine learning classifiers that use network (94.5% accuracy) and text (90.8% accuracy) features to predict the political alignment of nearly 20,000 politically active Twitter users. Using these predictions as high-fidelity proxies for political alignment, we find that, in contrast to the 2008 election cycle, right-leaning users allocate substantially more attention to political communication and exhibit a more tightly interconnected network structure, characteristics which facilitate the rapid and broad dissemination of political information. We conclude with an exploration of the policy focuses of these two groups, identifying key differences in the agendas of right- and left-leaning social media users ahead of the midterm elections.
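As a toy illustration of how network structure alone can predict political alignment, the sketch below propagates a few seed labels over an interaction graph, with each unlabeled user repeatedly taking the average alignment of their neighbors. This is my own simplified illustration of the general idea, not the classifiers used in the study.

```python
import numpy as np

def propagate_labels(adj, seed_labels, iters=30):
    """Semi-supervised label propagation on an undirected interaction graph.
    seed_labels maps a few node indices to known alignments (+1 or -1);
    everyone else starts at 0 and averages their neighbors each round."""
    A = np.asarray(adj, dtype=float)
    x = np.array([seed_labels.get(i, 0.0) for i in range(len(A))])
    seeds = np.array(sorted(seed_labels))
    deg = np.maximum(A.sum(axis=1), 1.0)           # avoid division by zero
    for _ in range(iters):
        x = A @ x / deg                            # mean neighbor alignment
        x[seeds] = [seed_labels[i] for i in seeds] # clamp the known users
    return np.sign(x)

# Two tight communities (triangles) joined by a single bridge edge, with
# one labeled user in each: propagation recovers both communities' labels.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1
alignment = propagate_labels(A, {0: 1.0, 5: -1.0})
```

In a segregated retweet-style network, the sparse cross-community connectivity is exactly what lets a handful of seeds label the rest of the graph accurately.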
Bio: Michael Conover is a Ph.D. student studying complex systems analysis at the Indiana University School of Informatics and Computing's Center for Complex Networks and Systems Research. Blending large scale computational analyses with media and political theory, Michael works to establish a body of literature that draws on the strengths of multiple fields while remaining accessible to diverse audiences.
nanoHUB.org powered by HUBzero — A Platform for Collaborative Research with Quantifiable Impact on Research and Education
Abstract: In June 2011 the National Science and Technology Council, which reports to President Obama, published Materials Genome Initiative for Global Competitiveness, writing: "Accelerating the pace of discovery and deployment of advanced material systems will therefore be crucial to achieving global competitiveness in the 21st century." The Council goes on to say, "Open innovation will play a key role in accelerating the development of advanced computational tools. An existing system that is a good example of a first step toward open innovation is the nanoHUB, a National Science Foundation program run through the Network for Computational Nanotechnology." By serving a community of 175,000 users in the past 12 months with an ever-growing collection of 2,700 resources, including 212 simulation tools, nanoHUB.org has established itself as "the world's largest nanotechnology user facility." nanoHUB.org is driving significant knowledge transfer among researchers and speeding transfer from research to education, quantified with usage statistics, usage patterns, collaboration patterns, and citation data from the scientific literature. Over 720 nanoHUB citations in the literature since 2001, resulting in a secondary-citation h-index of 30, demonstrate that high-quality research by users outside the pool of original tool developers can be enabled by nanoHUB processes. In addition to high-quality content, critical attributes of nanoHUB's success are its open access, ease of use, utterly dependable operation, low-cost and rapid content adaptation and deployment, and open usage and assessment data. The open-source HUBzero software platform, built for nanoHUB and now powering many other hubs, is architected to deliver a user experience that meets these criteria. This presentation will provide an overview of nanoHUB, its success metrics, and quantitative impact results.
(The phrase "the world's largest nanotechnology user facility" is a quote by Mikhail Roco, Senior Advisor for Nanotechnology, National Science Foundation.)
Bio: Gerhard Klimeck is the Director of the Network for Computational Nanotechnology at Purdue University and a Professor of Electrical and Computer Engineering. He guides the technical developments and strategies of nanoHUB.org. Previously he was the Technical Group Supervisor of the High Performance Computing Group and a Principal Scientist at the NASA Jet Propulsion Laboratory, Caltech and a member of technical staff at the Central Research Lab of Texas Instruments. Prof. Klimeck's research interest is in the modeling of nanoelectronic devices, parallel computing, genetic algorithms, and the Science of Science. Dr. Klimeck received his Ph.D. in 1994 on Quantum Transport Theory from Purdue University and his German electrical engineering degree on Experimental Non-Linear Optics in 1990 from Ruhr-University Bochum. Dr. Klimeck's work is documented in over 310 peer-reviewed publications and over 150 invited and 320 contributed conference presentations. He is a fellow of the Institute of Physics and a senior member of IEEE.
The Role of Social Networks and Boundary Spanning Organizations in Highly Innovative Communities
Abstract: Until very recently, few paid attention to the extent to which the social dynamics and cultural “grooves” of specific communities enabled or inhibited their capacity to recognize changing economic imperatives, integrate new knowledge into their understanding of their economic horizons, and develop effective strategies to renew or transform their economies. Emerging research on social networks and boundary spanning organizations suggests they are vital to the ability of communities to successfully build more nimble and innovative approaches to economic growth and job creation.
Bio: Mary Lindenstein Walshok, Ph.D., a sociologist, is Associate Vice Chancellor and Dean of the Extension Division at the University of California, San Diego. Over three decades, she has been a catalyst in building regional collaborations focused on high-tech cluster development (UCSD CONNECT) and cross-border synergies (the San Diego Dialogue) based on San Diego’s proximity to Mexico. She is the author of four books: Blue Collar Women, Knowledge Without Boundaries, Closing America’s Job Gap and Invention and Reinvention: The Evolution of San Diego’s Innovation Economy, forthcoming from Stanford University Press. She has also authored more than 100 reports and articles on the regional competencies and social dynamics essential to building knowledge-based clusters and high-wage jobs. Walshok’s current research activities include serving as the Principal Investigator for the evaluation of 13 Generation I WIRED regions funded by the U.S. Department of Labor; a two-year NSF-funded project comparing the distinctive social dynamics of three innovation regions – Philadelphia, St. Louis, and San Diego; and a Lilly Foundation-funded assessment of efforts to sustain and grow the robust orthopedic device industry in Warsaw, Indiana. Walshok is the recipient of numerous awards including the distinguished Kellogg Foundation’s Leadership Fellowship and, most recently, induction into Sweden's Royal Order of the Polar Star. Active on the boards of a number of arts and philanthropic organizations, Walshok chaired the boards of the San Diego Community Foundation during 2002-2004 and the International Community Foundation during 2007-2009. She is currently serving on the boards of San Diego CONNECT, the La Jolla Playhouse, the United States-Mexico Foundation for Science, the International Community Foundation, and the Girard Foundation.
Mapping Evaluation Models and Plans: Evaluation Protocols and Pathways
Abstract: Evaluations of programs and policies don’t occur in isolation. They are typically embedded in hierarchical systems of organizations (funders, administrative management, program management, program delivery) and the networks of models that guide evaluations can be usefully construed as interconnected conceptual systems encompassing program logic, research literature and evidence, measurement alternatives, etc. Increasingly the field of evaluation has been exploring systems approaches (evolutionary and ecological theories, network analysis, conceptual mapping, causal pathway modeling, etc.) to organize and represent evaluation models and plans and to network programs and people so they can more effectively function as a collective learning community. This talk presents work being conducted under an NSF grant to develop a general protocol for planning, implementing and utilizing an evaluation that connects such an effort to a broader ecosystem of evaluations. A key component of this work is the development of a complementary web-based cyberinfrastructure called the Netway that connects or networks causal pathway models from separate evaluations and enables identification of and communication between programs that share model features and evaluation needs. The evaluation protocol is briefly introduced, along with some of the systems thinking that is central to it. Then the Netway cyberinfrastructure is presented and some of the major challenges in designing it are introduced. In the general discussion, we hope to consider some of the possible directions that development of this approach and technology might take.
Bio: William Trochim is Professor of Policy Analysis and Management at Cornell University and Professor of Public Health at the Weill Cornell Medical Center. He is the Director of Evaluation for the Weill Cornell Clinical and Translational Science Center, the Director of Evaluation for Extension and Outreach at Cornell, and the Director of the Cornell Office for Research on Evaluation. His research focuses on the development and assessment of evaluation and research methods and their use for managing and enhancing science and biomedical research in the twenty-first century. Dr. Trochim has developed quasi-experimental alternatives to randomized experimental designs, including the regression discontinuity and regression point displacement designs. He created a structured conceptual modeling approach that integrates participatory group process with multivariate statistical methods to generate concept maps and models useful for theory development, planning and evaluation. He has published widely in the areas of applied research methods and evaluation including the books: Research Design for Program Evaluation: The Regression-Discontinuity Approach (1984), Concept Mapping for Planning and Evaluation (2005), Research Methods: The Concise Knowledge Base (2005), and the Research Methods Knowledge Base (2007). Dr. Trochim is currently conducting research with the National Institutes of Health on the evaluation of biomedical clinical and translational research and with the National Science Foundation on evaluating science, technology, engineering and mathematics (STEM) education programs. Dr. Trochim served for four years on the American Evaluation Association’s Board of Directors, was the chair of several AEA committees (electronic communications, public affairs), and was President of AEA (2008).
Academic Genealogy and the Development of Disciplines
Abstract: This talk will explore the use of academic genealogy networks to examine the formation, maturation, and intersection of disciplines. Using LIS as a case study, the presentation will consider the potential of these networks for providing empirical evidence about the development of disciplines. The talk will focus on issues of maturation and interdisciplinarity and will review potential sources and tools for collecting and analyzing academic genealogy networks. Future research and broader applications of this topic will be discussed.
Bio: Cassidy R. Sugimoto is an Assistant Professor in the School of Library and Information Science at Indiana University Bloomington. She received her doctoral degree from the University of North Carolina at Chapel Hill. Sugimoto teaches and researches in the areas of research design and scholarly communication. The focus of her research is on the formation, maturation, and interaction of disciplines in the 20th century, from a scientometric approach. She is interested in the application of academic genealogy networks to inform studies of science.
Informatics for science-based groundwater management and socio-technical interfaces
Abstract: The field of Integrated Water Resource Management (IWRM) engages groups to explore collaborative decision making with the use of simulation-optimization models and decision support systems. Of particular interest is the implementation of IWRM approaches to groundwater systems. Groundwater, which makes up 98% of the total available freshwater on Earth, is notably absent from formal education curricula and public communication about water resource availability. The result is a public that is unacquainted with one of society's most precious resources. Melding informatics with collaborative modeling poses opportunities to educate an informed citizenry with the capacity to visually explore complex scientific topics and participate in substantive dialogue. This work presents results of topical analysis using management and policy texts as compared with modeled outputs from a Groundwater Decision Support System (GWDSS). A conceptual meta-model, or schema, has been developed to overlay policy objectives with feasible sets of groundwater responses. The resultant network presents an interface with the capacity to span knowledge domains between planning contexts and scientific computation. Informatics visualizations provide a socio-technical interface to activate generative dialogue and catalyze science-based deliberation.
Bio: Suzanne A. Pierce is a Research Assistant Professor with the Center for International Energy and Environmental Policy in the Jackson School of Geosciences and Assistant Director of the Digital Media Collaboratory in the Center for Agile Technology at The University of Texas at Austin. A trained hydrogeologist with a focus on participatory deliberation, Dr. Pierce has prior professional background as a Scientist with Sandia National Laboratories and as the Environmental Manager for one of the world's largest metals mines. Dr. Pierce adopts a scholar-practitioner approach to integrate science-based information with human organizational systems for application to groundwater management and energy-water problems. Resultant decision support systems link participatory modeling with simulation, optimization, and multi-stakeholder concerns. Current projects include development of hydroinformatics for sustainable aquifer yield in Central Texas and South Australia, along with evaluating perceptions of science at a geothermal basin in the Atacama Desert of Chile.
Text Classification of the Biomedical Literature
Abstract: Much of the research presently conducted in the biomedical domain relies on the induction of correlations and interactions from data. Because we ultimately want to increase our knowledge of the biochemical and functional roles of genes and proteins in organisms, there is a clear need to integrate the associations and interactions among biological entities that have been reported and accumulated in the literature and databases. Biomedical literature mining is an important informatics methodology for large-scale information extraction from repositories of textual documents, as well as for integrating information available in various domain-specific databases and ontologies, ultimately leading to knowledge discovery. It helps us tap into the biomedical collective knowledge and uncover relationships and interactions buried in the literature and databases, and even those inferred from global information but unreported in individual experiments. Our approach to literature mining is based on bottom-up, data-driven or bio-inspired methods, which we have applied to automatic discovery, classification and annotation of protein-protein and gene-disease interactions, pharmacokinetic data, protein sequence family and structure prediction, functional annotation of transcription data, enzyme annotation publications, and so on. In this talk I will focus on the lightweight Variable Trigonometric Threshold (VTT) linear classifier we developed for biomedical text classification, which we have applied successfully to the protein-protein interaction literature. The latest version of this method utilizes a number of features obtained via Named Entity Recognition (NER) and dictionary tools. We will discuss our results with this classifier in the recent BioCreative challenges, where it has performed very well. We will also contrast this method with ongoing research in our group to develop biologically inspired methods for biomedical text classification.
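The general flavor of a lightweight linear text classifier, a per-term weight learned from labeled documents, summed into a document score and compared against a threshold, can be sketched as follows. This is a simplified stand-in using smoothed log-odds weights, not the published VTT formulation or its NER-derived features.

```python
import math
from collections import Counter

def train_term_weights(docs, labels, smoothing=1.0):
    """Per-term log-odds weights from labeled training documents (label 1 =
    relevant, e.g. reports a protein-protein interaction; label 0 = not)."""
    pos, neg = Counter(), Counter()
    for doc, y in zip(docs, labels):
        (pos if y == 1 else neg).update(set(doc.lower().split()))
    n_pos = sum(1 for y in labels if y == 1)
    n_neg = len(labels) - n_pos
    return {t: math.log((pos[t] + smoothing) / (n_pos + 2 * smoothing))
             - math.log((neg[t] + smoothing) / (n_neg + 2 * smoothing))
            for t in set(pos) | set(neg)}

def classify(doc, weights, threshold=0.0):
    """Label a document 1 if its mean term weight clears the threshold."""
    terms = [t for t in set(doc.lower().split()) if t in weights]
    if not terms:
        return 0
    return 1 if sum(weights[t] for t in terms) / len(terms) > threshold else 0
```

In practice the threshold would be tuned on held-out data, and the feature set enriched with entity annotations, which is where approaches like VTT gain their accuracy.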
Bio: Luis M. Rocha is an Associate Professor and director of the Complex Systems Graduate Program in Informatics, a member of the Center for Complex Networks and Systems Research, and core faculty of the Cognitive Science Program at Indiana University, Bloomington, USA. He is also the director of the FLAD Computational Biology Collaboratorium and is involved in directing the associated PhD program in Computational Biology at the Instituto Gulbenkian da Ciencia, Portugal. His research is on complex systems, computational biology, artificial life, embodied cognition, and bio-inspired computing. He received his Ph.D. in Systems Science in 1997 from the State University of New York at Binghamton. From 1998 to 2004 he was a permanent staff scientist at the Los Alamos National Laboratory, where he founded and led a Complex Systems Modeling Team during 1998-2002, and he was part of the Santa Fe Institute research community. He has organized major conferences in the field such as the Tenth International Conference on the Simulation and Synthesis of Living Systems (Alife X) and the Ninth European Conference on Artificial Life (ECAL 2007). He has published many articles in scientific and technology journals, and has been the recipient of several scholarships and awards. Details about his research and teaching are available on his web site: http://informatics.indiana.edu/rocha.
The Trouble with House Elves: Challenges for a Computational Folkloristics
Abstract: Folklore collections are generally indexed according to the dictum, "one story, one classifier." This approach to collection indexing was generally serviceable as long as the research questions aligned with indexing practices, and as long as the collections were relatively small. As research questions changed and collections became much larger--including stories from thousands or tens of thousands of storytellers, and constituting tens of thousands of pages or hours of recording--these simple finding aids were revealed to be inadequate for addressing even the simplest needs of researchers. Using a 19th-century collection of Danish folklore, we explore the use of network analysis tools for search and discovery. We show how a tuned Markov Clustering (MCL) algorithm can be used (a) to discover stories needed to address research questions not considered by the initial indexing scheme and (b) to find previously unrecognized affinities among stories that can lead to new research questions. A second part of the presentation focuses on how to visualize geographic relations between individuals and their story repertoires. The audience is reminded not to present clothing to the house elf accompanying the lecturer.
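For readers unfamiliar with MCL, the sketch below shows the core algorithm (alternating expansion and inflation of a column-stochastic flow matrix) on a toy similarity graph; the node labels and edges are hypothetical, standing in for story-to-story similarity scores:

```python
# A minimal sketch of Markov Clustering (MCL) on a small toy graph.
# In the application described above, nodes would be stories and edge
# weights would come from story-to-story similarity measures.
import numpy as np

def mcl(adj, expansion=2, inflation=2.0, iters=50):
    """Run MCL on an adjacency matrix; returns the converged flow matrix."""
    m = adj.astype(float) + np.eye(len(adj))      # add self-loops
    m /= m.sum(axis=0)                            # column-normalize
    for _ in range(iters):
        m = np.linalg.matrix_power(m, expansion)  # expansion: spread flow
        m = m ** inflation                        # inflation: sharpen flow
        m /= m.sum(axis=0)                        # re-normalize columns
    return m

def clusters(flow, tol=1e-6):
    """Read clusters from the nonzero rows of the converged flow matrix."""
    out = []
    for row in flow:
        members = frozenset(np.nonzero(row > tol)[0])
        if members and members not in out:
            out.append(members)
    return out

# Two obvious communities, {0,1,2} and {3,4,5}, joined by one weak bridge.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
A = np.zeros((6, 6))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
print(clusters(mcl(A)))
```

The inflation parameter controls cluster granularity, which is presumably what "tuned" refers to in the abstract.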
Bio: Timothy R. Tangherlini teaches folklore, literature, and cultural studies at the University of California, where he is a professor in the Scandinavian Section and the Department of Asian Languages and Cultures. He is the author of Interpreting Legend: Danish Storytellers and Their Repertoires (1994) and Talking Trauma: Paramedics and Their Stories (1998), and the co-editor of Nationalism and the Construction of Korean Identity (1999) and Sitings: Critical Approaches to Korean Geography (2008). He has also produced or co-produced two documentary films, Talking Trauma: Storytelling Among Paramedics (1994) and Our Nation: A Korean Punk Rock Community (2002). His current work focuses on computation and the humanities. In particular, he has focused on using GIS to discover patterns in folklore collections, and network analysis techniques to address problems of classification. Links to this work can be found at http://tango.bol.ucla.edu/#online. He directed the NEH's "Networks and Network Analysis for the Humanities" Summer Institute for Advanced Topics in Digital Humanities at NSF's Institute for Pure and Applied Mathematics at UCLA in summer 2010. His research has been supported by grants from the National Science Foundation, the National Endowment for the Humanities, the Fulbright Foundation, the Nordic Council of Ministers, the John Simon Guggenheim Memorial Foundation, the American Council of Learned Societies, the Henry Luce Foundation, the American Scandinavian Foundation, and Google.
Modeling Social Contagion in Networks
Abstract: Social networks have introduced novel challenges in statistical modeling, and therefore novel pitfalls. We illustrate this by using a series of recent highly touted papers by Christakis and Fowler that claim to have demonstrated the existence of transmission via social networks of various personal characteristics, including obesity, smoking cessation, happiness, and loneliness. Those papers also assert that such influence extends to three degrees of separation in social networks. However, their statistical methodology is seriously flawed at many levels, as we explain.
Also published as: Lyons, Russell (2011) "The Spread of Evidence-Poor Medicine via Flawed Social-Network Analysis," Statistics, Politics, and Policy: Vol. 2 : Iss. 1, Article 2.
Bio: Russell Lyons obtained his Ph.D. at the University of Michigan in 1983, where he specialized in harmonic analysis. He then spent two postdoctoral years in Paris and five years as Assistant Professor at Stanford University. In 1988, his field of research switched primarily to probability theory. He moved to Indiana University in 1990, where he has been since, except for two years at Georgia Tech. He has also spent research leaves at the University of New South Wales, the University of Wisconsin, the University of Lyon, Hebrew University of Jerusalem, the Weizmann Institute of Science, the University of California (Berkeley), and Microsoft Research. Lyons is Professor of Mathematics and Adjunct Professor of Statistics. His research now is primarily in discrete probability and its connections to other areas of mathematics, including ergodic theory, geometric group theory, and combinatorics. He is also very interested in the teaching of statistics and has begun doing some research in statistics. He is writing a graduate-level textbook, Probability on Trees and Networks.
Building Networks of Action Situations for the Analysis of Policy Processes and Institutions
Abstract: Within the Institutional Analysis and Development (IAD) framework, the concept of an action situation generalizes a game to allow for endogenous changes in its rules. This paper revisits this core concept to explore its potential for serving as the foundation for a systematic approach to the construction of more elaborate models of complex policy networks in which overlapping sets of actors have the ability to influence the rules under which their strategic interactions take place. Networks of adjacent action situations can be built on the basis of the seven distinct types of rules that define an action situation or by representing generic governance tasks identified in related research on local public economies. The potential of this extension of the IAD framework is demonstrated with simplified network representations of three diverse policy areas (Maine lobster fisheries, international development assistance, and the contribution of faith-based organizations to U.S. welfare policy). See also Michael D. McGinnis. 2011. “Networks of Adjacent Action Situations in Polycentric Governance,” Policy Studies Journal 39 (1) (March 2011), 45-72. Pre-print is available at http://php.indiana.edu/~mcginnis/naas.pdf.
Bio: Michael D. McGinnis is Professor in the Department of Political Science at Indiana University, Bloomington. He serves as Director of the Workshop in Political Theory and Policy Analysis, an interdisciplinary research and teaching center focused on the study of institutions, development, and governance. The Workshop was initially established in 1973 by Vincent and Elinor Ostrom, and its continuing importance was dramatically recognized when Elinor Ostrom was awarded the 2009 Nobel Prize in Economic Sciences. McGinnis received a B.S. in mathematics from the Ohio State University in 1980 and a Ph.D. in political science from the University of Minnesota in 1985, and he has worked at IU ever since. In his early research Prof. McGinnis used game theory to model arms races, alliances, wars, peace negotiations, and other interactions between domestic and international politics. He has published several articles in political science and international relations journals, as well as chapters in edited volumes. He is co-author, with John T. Williams, of Compound Dilemmas: Democracy, Collective Action, and Superpower Rivalry (University of Michigan Press, 2001) and editor of three volumes of readings on governance issues written by scholars associated with the Workshop in Political Theory and Policy Analysis. He was co-editor of International Studies Quarterly (1994-98).
Overview of the Indiana CTSI program
William Hetrick and William Barnett
Abstract: The Indiana Clinical and Translational Sciences Institute was founded three years ago to initiate a strategic translational approach to health care research across the State of Indiana. The Institute is a novel partnership of IU, Purdue, and Notre Dame along with hospitals, industry, government, and community organizations. Drs. Hetrick and Barnett will discuss the specific CTSI programs for basic and clinical research, funding, education, and community outreach.
Bio: William Hetrick's research focuses on brain-behavior relationships in psychopathology, including schizophrenia, bipolar disorder, and autism.
Bill Barnett leads the life science practice for research technologies, where he coordinates relevant IT services for all IU campuses and biomedical applications development. He also leads the development of novel cyberinfrastructures for analytics, data management, and virtual organizations for the research enterprise, including the Indiana CTSI HUB. Dr. Barnett received his degree from Boston University in archaeology, specializing in the origins of agriculture and the socioeconomics of ceramic production and distribution in the western Mediterranean.
Abstract: This talk opens with a discussion of major changes in the landscape of science that pose challenges and opportunities for the design of effective data analysis and visualization tools. We then present a set of desirable features for designing plug-and-play "macroscope" tools and review related work. Next, we explain the design of a software architecture that extends the Open Services Gateway Initiative Framework (OSGi) and uses the Cyberinfrastructure Shell (CIShell) (http://cishell.org) to support the easy integration of new and existing algorithms and their synergistic combination. The OSGi/CIShell software framework is at the core of five plug-and-play tools that serve different scientific communities: the IVC was developed for research and education in information visualization; the Network Workbench (NWB) tool was designed for large-scale network analysis, modeling, and visualization; the Science of Science (Sci2) tool is used by science of science (policy) researchers; the EpiC tool is under development for use by epidemiologists; and TEXTrend supports the analysis of text. We present two of these tools in detail: the NWB tool (http://nwb.cns.iu.edu) and the Sci2 Tool (http://sci2.cns.iu.edu). The talk concludes with a discussion of related efforts and an outlook into the not-too-distant future.
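The plug-and-play idea behind such an architecture can be sketched in miniature: algorithms register themselves in a shared registry and are chained by name, so the host tool need not know them in advance. The names and registry mechanism below are illustrative, not the actual OSGi/CIShell API:

```python
# A toy sketch of a plug-and-play algorithm registry, loosely in the
# spirit of OSGi/CIShell-style tools. All names here are hypothetical.
REGISTRY = {}

def plugin(name):
    """Decorator that registers an algorithm under a service name."""
    def register(fn):
        REGISTRY[name] = fn
        return fn
    return register

@plugin("extract_edges")
def extract_edges(records):
    """Drop edge attributes, keeping only (source, target) pairs."""
    return [(a, b) for a, b, *_ in records]

@plugin("degree_count")
def degree_count(edges):
    """Count how many edges touch each node."""
    degrees = {}
    for a, b in edges:
        degrees[a] = degrees.get(a, 0) + 1
        degrees[b] = degrees.get(b, 0) + 1
    return degrees

def run_pipeline(data, steps):
    """Chain registered algorithms by name, as a tool's menu might."""
    for step in steps:
        data = REGISTRY[step](data)
    return data

result = run_pipeline([("a", "b", 1), ("b", "c", 2)],
                      ["extract_edges", "degree_count"])
print(result)  # {'a': 1, 'b': 2, 'c': 1}
```

The point of the design is that new algorithms can be dropped into the registry without modifying the pipeline runner, which is what lets one framework serve tools as different as NWB and Sci2.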
Bio: Katy Börner is the Victor H. Yngve Professor of Information Science at the School of Library and Information Science, Adjunct Professor at the School of Informatics and Computing, Adjunct Professor in the Department of Statistics in the College of Arts and Sciences, Core Faculty of Cognitive Science, and Founding Director of the Cyberinfrastructure for Network Science Center (http://cns.iu.edu) at Indiana University. She is the curator of the Places & Spaces: Mapping Science exhibit (http://scimaps.org). Her research focuses on the development of data analysis and visualization techniques for information access, understanding, and management. She is the co-editor of the Springer book 'Visual Interfaces to Digital Libraries' and of a special issue of PNAS on 'Mapping Knowledge Domains' (2004). Her new book 'Atlas of Science' was published by MIT Press in 2010 (http://scimaps.org/atlas). She holds an MS in Electrical Engineering from the University of Technology in Leipzig, 1991, and a Ph.D. in Computer Science from the University of Kaiserslautern, 1997.
Mapping Interactions Within the Evolving Science of Science and Innovation Policy Community
Abstract: The Science of Science & Innovation Policy (SciSIP) program at the National Science Foundation (NSF) supports research designed to advance the scientific basis of science and innovation policy. The program was established at NSF in 2005 in response to a call from Dr. John Marburger III, then science advisor to the U.S. President, for a “science” of science policy. It has co-funded 162 awards that aim to develop, improve and expand models, analytical tools, data and metrics that can be directly applied in the science policy decision making process. The long-term goals of the SciSIP program are to provide a scientifically rigorous and quantitative basis for science policy and to establish an international community of practice. The program has an active listserv that, as of January 2011, has almost 700 members from academia, government, and industry.
This talk will summarize a recent study that analyzed all SciSIP awards made so far in an attempt to identify existing collaboration networks and co-funding relations between SciSIP and other areas of science. In addition, listserv messages were downloaded and analyzed to derive complementary discourse information. Key results include evidence of rich diversity in communication and funding networks and effective strategies for interlinking researchers and science policy makers, prompting discussion, and resource sharing.