February 4, 2014

An IBM PureData N1001 computer has been acquired and installed at Dalhousie for use in the Institute's research projects. IBM describes this machine as:

"a high-performance, scalable, massively parallel system that enables clients to perform analytics on enormous data volumes. Big data volumes are made simpler, faster and more accessible. This system, powered by Netezza technology, is designed specifically for running complex analytics on very large data volumes, orders of magnitude faster than competing solutions."

Thanks to CFI LOI funding, we look forward to using this powerful data-handling system in the Institute's projects.

December 9, 2013

Rob Warren gave a presentation at Curtin University, Australia, as part of their Adventures in Culture and Technology Series.

Ask not what you can do for Linked Open Data but what Linked Open Data can do for you

Digital Humanities scholars have long been hampered by the twin problems of getting data into digital form and then managing ever-increasing amounts of it. Too often, the data behind the research becomes a prisoner of a ‘research portal’ or is lost on someone’s laptop. In many ways the most successful data management tool so far is the spreadsheet – a 40-year-old technology!

This talk is about linked open data, or the semantic web, an approach to the management of data that is showing promise for researchers, libraries and archives. The talk is non-technical and focuses on explaining how real-world research data problems can be solved. These include establishing the identity of historical persons; dealing with incomplete or false data; identifying or referencing lost geographical locations; and encouraging the serendipitous reuse of data in other projects.

Real-world examples of problematic data from the Great War will be shown from the Muninn Project and the solutions using linked open data approaches.
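One of the research problems mentioned above, establishing the identity of historical persons across archives, is commonly handled in linked open data with owl:sameAs links. The following is a minimal, self-contained sketch of that idea, with RDF-style triples modelled as plain Python tuples; all identifiers, names and predicates here are invented for illustration and are not drawn from the Muninn Project.

```python
# Minimal sketch: RDF-style triples as Python tuples, with owl:sameAs
# links reconciling two archive records that describe the same person.
# All URIs, names and predicates are invented for illustration.

SAME_AS = "owl:sameAs"

triples = [
    # Two archives describe the same soldier under different identifiers.
    ("archiveA:person/1042", "foaf:name", "John A. Smith"),
    ("archiveB:soldier/77",  "foaf:name", "J. Smith"),
    ("archiveB:soldier/77",  "ex:diedAt", "geo:vimy-ridge"),
    # A linked-data assertion that the two identifiers denote one entity.
    ("archiveA:person/1042", SAME_AS, "archiveB:soldier/77"),
]

def same_as_closure(triples):
    """Group identifiers connected by owl:sameAs (union-find style)."""
    parent = {}
    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x
    for s, p, o in triples:
        if p == SAME_AS:
            parent[find(s)] = find(o)
    groups = {}
    for x in list(parent):
        groups.setdefault(find(x), set()).add(x)
    return groups

def facts_about(entity, triples):
    """Merge the facts asserted about any identifier of the entity."""
    aliases = {entity}
    for group in same_as_closure(triples).values():
        if entity in group:
            aliases = group
    return {(p, o) for s, p, o in triples if s in aliases and p != SAME_AS}

# Facts from both archives are now visible through either identifier.
print(facts_about("archiveA:person/1042", triples))
```

The point of the sketch is that neither archive has to change its own identifiers: the sameAs link lets a query over one record pick up facts recorded under the other.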

November 29, 2013

The Institute is proud to be sending Postdoctoral Fellow Rob Warren to Australia to work for a period at Curtin University with Erik Champion and Gordon Davies.

Dr. Rob Warren (left) with Dr. Erik Champion (third from left) and the other visiting fellows.

Dr. Warren will be working on realistic simulations of the Great War. He aims to integrate extremely detailed semantic web databases (geo-data, astronomical data, archival material) with immersive visualisation technology such as Curtin’s iDome, to create ultra-realistic simulations based on actual maps, events and the people who were there.

The project will also provide an example of how the humanities and the "Big Data" sciences can work together in disseminating major cultural events, archival information, and cutting-edge technology and information retrieval to the general public. There are significant research problems in the visualization of Big Data to which ultra-realistic simulations may provide novel solutions.

A wireframe terrain prototype of the German trenches of the Hindenburg Line in 1916 on the Western Front.

November 6, 2013

A paper entitled "A data science specialization in the bachelor of computer science," by E. Milios, P. Bodorik, S. Matwin, A. Rau-Chaplin and M. Shepherd, was presented by S. Matwin at Big Data and Analytics EdCon 2013 in Las Vegas.

October 4, 2013

Stan Matwin, Director, and Andrew Rau-Chaplin, member of the Institute Executive Committee have been invited to present papers at the 2013 IEEE International Conference on Big Data October 6-9 in Santa Clara, California. The Conference provides a leading forum for disseminating the latest research in Big Data Research, Development, and Application.

Papers to be presented are:

Xuan Liu, Xiaoguang Wang, Stan Matwin, Nathalie Japkowicz, “Meta-learning for Large Scale Machine Learning with MapReduce”.

A. Rau-Chaplin, B. Varghese, D. Wilson, Z. Yao, and N. Zeh, "QuPARA: Query-Driven Large-Scale Portfolio Aggregate Risk Analysis on MapReduce".

F. Dehne, Q. Kong, A. Rau-Chaplin, H. Zaboli, and R. Zhou, "A Distributed Tree Data Structure For Real-Time OLAP On Cloud Architectures".


October 3, 2013

On October 3, the Institute for Big Data Analytics hosted the members of the "Consortium de Recherche et d'Innovation en Aérospatiale au Québec" (CRIAQ) for a session on Big Data. The Consortium is a non-profit organization established in 2002 with the financial support of the Québec government. Its mission is to increase the competitiveness of the aerospace industry and to enhance the collective knowledge base in aerospace through improved education and training of students. As part of their Atlantic Tour, they participated in two days of workshops in Halifax, including the session on Big Data.

September 26, 2013

At the invitation of Giesecke & Devrient (G&D), a leading German technology company, Dr. Stan Matwin delivered a presentation on the topic of Big Data and Corporate Social Responsibility to the international “IDENTITY / Talk in the Tower” series. G&D was motivated to set up this forum not only by its own business interests in securing values and digital transactions worldwide and in creating products that protect the authenticity of identities in the real and digital worlds, but also by a desire to support public debate on the issue.

Specifically in relation to big data, G&D had set up a task force to focus on big data and data minimization. "Big Data comes with chances and risks for users. The challenges include the management of identity related data within Big Data. Data minimization could be a solution to protect identities successfully. What impact has Big Data on our identity and how can data minimization be implemented? How can technology support data minimization?"

Dr. Matwin spoke to the task force and addressed the questions, “Where are the ethical problems? What does society expect from companies when it comes to a responsible use of Big Data? How can users be better educated about chances and challenges when it comes to Big Data?”


David Langstroth

August 22, 2013

The Institute for Big Data Analytics is looking forward to collaborating with Montreal-based company "Wildcard" to tackle an interesting research problem over a six-month period, starting in September. Wildcard is a leading provider of loyalty cards and mobile apps for nightlife entertainment venues. As part of its loyalty offerings, Wildcard will recommend evening entertainment venues to its subscribers based on a matching process. If the matching process does not work well, subscriber fatigue will increase the probability that future suggestions will be ignored, and may result in the loss of the subscriber. The need for precision makes this problem complex; in addition, the process must be fast enough to keep up with changes at different venues and to serve hundreds of thousands of recommendations per hour.
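Wildcard's actual matching process is not described here, but purely as an illustration of the general shape of such a problem, the following is a minimal content-based sketch that scores venues against a subscriber's taste profile with cosine similarity. All venue names, features and weights are invented.

```python
import math

# Hypothetical feature vectors: how strongly a venue (or a subscriber's
# taste profile) expresses each attribute. Features and weights are
# invented for illustration only.
FEATURES = ["live_music", "dancing", "late_hours", "craft_cocktails"]

venues = {
    "The Attic":   [0.9, 0.8, 0.7, 0.1],
    "Vinyl Room":  [0.2, 0.1, 0.3, 0.9],
    "Basement 54": [0.1, 0.9, 0.9, 0.2],
}

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def recommend(profile, venues, top_n=2):
    """Rank venues by similarity to a subscriber's taste profile."""
    scored = sorted(venues.items(),
                    key=lambda kv: cosine(profile, kv[1]),
                    reverse=True)
    return [name for name, _ in scored[:top_n]]

# A subscriber who likes dancing and late nights.
print(recommend([0.1, 0.9, 0.8, 0.0], venues))  # → ['Basement 54', 'The Attic']
```

A production system would of course need far more than this sketch: venue features change over time, and scoring every venue against every subscriber does not scale to hundreds of thousands of recommendations per hour, which is precisely where the research problem lies.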

The research is funded through NSERC's Engage Grant program and will involve Dr. Stan Matwin, a postdoctoral fellow from Dalhousie, and the relevant personnel from Wildcard.


David Langstroth