Welcome to Research Repository UCD

Research Repository UCD is a digital collection of open access scholarly research publications from University College Dublin. Research Repository UCD collects, preserves and makes freely available publications including peer-reviewed articles, working papers and conference papers created by UCD researchers. Where material has already been published it is made available subject to the open-access policies of the original publishers. This service is maintained by UCD Library.

Most downloaded
  • Publication
    Corporate governance, accountability and mechanisms of accountability : an overview
    Purpose – This paper reviews traditional corporate governance and accountability research, to suggest opportunities for future research in this field. The first part adopts an analytical frame of reference based on theory, accountability mechanisms, methodology, business sector/context, globalisation and time horizon. The second part of the paper locates the seven papers in the special issue in a framework of analysis showing how each one contributes to the field. The paper presents a frame of reference which may be used as a 'roadmap' for researchers to navigate their way through the prior literature and to position their work on the frontiers of corporate governance research. Design/methodology/approach – The paper employs an analytical framework, and is primarily discursive and conceptual. Findings – The paper encourages broader approaches to corporate governance and accountability research beyond the traditional and primarily quantitative approaches of prior research. Broader theoretical perspectives, methodological approaches, accountability mechanisms, sectors/contexts, globalisation and time horizons are identified. Research limitations/implications – Greater use of qualitative research methods is suggested, which presents challenges, particularly of access to the 'black box' of corporate boardrooms. Originality/value – Drawing on the analytical framework and the papers in the special issue, the paper identifies opportunities for further research on accountability and corporate governance.
      33130 downloads · Scopus© Citations 289
  • Publication
    Elderly care in Ireland - provisions and providers
    (University College Dublin. School of Social Justice, 2010-04)
  • Publication
    Equality in education : an equality of condition perspective
    (Sage Publications, 2005)
    Transforming schools into truly egalitarian institutions requires a holistic and integrated approach. Using a robust conception of 'equality of condition', we examine key dimensions of equality that are central to both the purposes and processes of education: equality in educational and related resources; equality of respect and recognition; equality of power; and equality of love, care and solidarity. We indicate in each case some of the major changes that need to occur if we are to promote equality of condition. Starting with inequalities of resources, and in particular with inequalities tied to social class, we argue for abandoning rigid grouping policies, challenging the power of parents in relation to both selection and grouping, and changing curricula and assessment systems to make them more inclusive of the wide range of human intelligences. In relation to respect and recognition, we call for much more inclusive processes for respecting differences, not only in schools' organizational cultures, but also in their curriculum, pedagogy and assessment systems. Regarding inequalities of power, we call for democratization of both teacher-student relationships and school and college organization. For promoting equality of love, care and solidarity, we argue that schools need to develop an appreciation of the intrinsic role that emotions play in the process of teaching and learning, to provide a space for students and teachers to talk about their feelings and concerns, and to devise educational experiences that will enable students to develop their emotional skills or personal intelligences as a discrete area of human capability.
      23031 downloads · Scopus© Citations 133
  • Publication
    Discretionary disclosure strategies in corporate narratives : incremental information or impression management?
    (University of Florida. Fisher School of Accounting, 2007)
    The purpose of this paper is to review and synthesize the literature on discretionary narrative disclosures. We explore why, how, and whether preparers of corporate narrative reports use discretionary disclosures in corporate narrative documents and why, how, and whether users react thereto. To facilitate the review, we provide three taxonomies based on: the motivation for discretionary narrative disclosures (opportunistic behavior, i.e. impression management, versus provision of useful incremental information); the research perspective (preparer versus user); and seven discretionary disclosure strategies. We also examine the whole range of theoretical frameworks utilized by prior research, and we put forward some suggestions for future research.
  • Publication
    Using Twitter to recommend real-time topical news
    Recommending news stories to users, based on their preferences, has long been a favourite domain for recommender systems research. In this paper, we describe a novel approach to news recommendation that harnesses real-time micro-blogging activity, from a service such as Twitter, as the basis for promoting news stories from a user's favourite RSS feeds. A preliminary evaluation is carried out on an implementation of this technique that shows promising results.
      21301 downloads · Scopus© Citations 334
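    The core idea of promoting feed stories using real-time micro-blogging activity can be sketched as ranking candidate headlines by term overlap with a window of recent tweets. This is a minimal illustrative sketch, not the authors' implementation; the scoring function and helper names are assumptions:

    ```python
    import re
    from collections import Counter

    def rank_articles(tweets, articles):
        """Rank candidate RSS headlines by how often their terms co-occur
        in a window of recent tweets (simple term-overlap relevance;
        the paper's actual scoring may differ)."""
        term_counts = Counter(
            word for tweet in tweets for word in re.findall(r"[a-z]+", tweet.lower())
        )

        def score(article):
            return sum(term_counts[w] for w in set(re.findall(r"[a-z]+", article.lower())))

        return sorted(articles, key=score, reverse=True)
    ```

    With tweets about an election, an election headline would be ranked ahead of an unrelated one.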
  • Publication
    From asset based welfare to welfare housing? The changing function of social housing in Ireland
    (Routledge, 2011)
    This article examines a distinctive and significant aspect of social housing in Ireland – its change in function from an asset-based role in welfare support to a more standard model of welfare housing. It outlines the nationalist and agrarian drivers which expanded the initial role of social housing beyond the goal of improving housing conditions for the poor towards the goal of extending home ownership and assesses whether this focus made it more similar to the ‘asset based welfare’ approach to housing found in south-east Asia than to social housing in western Europe. From the mid-1980s, the role of Irish social housing changed as the sector contracted and evolved towards the model of welfare housing now found in many other western countries. Policy makers have struggled to address the implications of this transition, and vestiges of social housing’s traditional function are still evident; consequently, the boundaries between social housing, private renting and home ownership in Ireland have grown increasingly nebulous.
      20853 downloads · Scopus© Citations 29
  • Publication
    Constructive approaches towards water treatment works sludge management : an international review of beneficial re-uses
    (Taylor & Francis, 2007-03)
    To date, virtually all known drinking water processing systems generate an enormous amount of residual sludge, and what to do with this rapidly increasing 'waste' stream in an economic and environmentally sustainable manner remains a significant environmental issue. The realization of this fact has led to a series of concerted efforts aimed at beneficial re-uses, in an effort to close the loop between efficient water treatment and sustainable sludge management. This paper therefore presents a comprehensive review of the available literature on attempts at beneficial reuse of water treatment plant sludge, in order to provide a compendium of recent and past developments and update our current state of knowledge. Four broad categories of use, comprising over eleven possible ways in which waterworks sludges can be reused, were identified and examined. Obvious advantages of such reuse options were highlighted and knowledge gaps identified. Future issues that will assist in the development of sustainable, multi-pronged waterworks sludge management options were also discussed.
      19763 downloads · Scopus© Citations 378
  • Publication
    Expansive cements and soundless chemical demolition agents : state of technology review
    Expansive cements and soundless chemical demolition agents (SCDAs) were first introduced in the early 1970s but failed to gain widespread adoption for selective removal of rock and concrete due to their proprietary nature and a lack of usage guidelines. Nearly 40 years later, the patents have expired, and a large number of competitive products have entered the market. These factors coupled with a heightened interest in their potential environmental benefits have greatly expanded their usage. Specifically, these chemicals can be introduced into a pattern of small, drilled holes in concrete and/or rock. After a specific period (usually less than 24 hours), the in-situ material will crack sufficiently that it can be removed without the use of traditional explosives or further percussive efforts. The products generate substantially less noise and vibration than usually associated with the removal of rock and concrete. This paper provides a state-of-the-technology review of five available products. The focus is on the proposed applicability of various products under specific conditions. Special attention is paid to the viability of such agents under varying temperatures and with materials of particular strengths.
  • Publication
    Clustering with the multivariate normal inverse Gaussian distribution
    Many model-based clustering methods are based on a finite Gaussian mixture model. The Gaussian mixture model implies that the data scatter within each group is elliptically shaped. Hence non-elliptical groups are often modeled by more than one component, resulting in model over-fitting. An alternative is to use a mean–variance mixture of multivariate normal distributions with an inverse Gaussian mixing distribution (MNIG) in place of the Gaussian distribution, to yield a more flexible family of distributions. Under this model the component distributions may be skewed and have fatter tails than the Gaussian distribution. The MNIG-based approach is extended to include a broad range of eigendecomposed covariance structures. Furthermore, MNIG models in which the other distributional parameters are constrained are considered. The Bayesian Information Criterion is used to identify the optimal model and number of mixture components. The method is demonstrated on three sample data sets, and a novel variation on the univariate Kolmogorov–Smirnov test is used to assess goodness of fit.
      17469 downloads · Scopus© Citations 59
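    The Bayesian Information Criterion step described above can be illustrated with a toy sketch. The log-likelihood values below are invented for illustration (fitting the MNIG mixture itself takes far more code); only the BIC formula and the "pick the lowest score" selection rule are faithful to the abstract:

    ```python
    import math

    def bic(log_likelihood, n_free_params, n_obs):
        # Bayesian Information Criterion: lower is better. Complexity is
        # penalised by n_free_params * ln(n_obs).
        return n_free_params * math.log(n_obs) - 2.0 * log_likelihood

    # Hypothetical fitted mixtures: (n_components, free parameters, max log-likelihood)
    candidates = [(1, 2, -1450.0), (2, 5, -1380.0), (3, 8, -1376.0)]
    n_obs = 500

    scores = {g: bic(ll, k, n_obs) for g, k, ll in candidates}
    best_g = min(scores, key=scores.get)  # number of components with lowest BIC
    ```

    Here the three-component model's small likelihood gain does not justify its extra parameters, so the two-component model is selected.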
  • Publication
    Inequality and crime
    (MIT Press, 2000-11)
    This paper considers the relationship between inequality and crime using data from urban counties. The behavior of property and violent crime are quite different. Inequality has no effect on property crime but a strong and robust impact on violent crime, with an elasticity above 0.5. By contrast, poverty and police activity have significant effects on property crime, but little on violent crime. Property crime is well explained by the economic theory of crime, while violent crime is better explained by strain and social disorganization theories.
      17191 downloads · Scopus© Citations 418
  • Publication
    Agent-based coordination for the sensor web
    The approach described advocates the use of a multi-agent system, and specifically the use of multi-agent distributed constraint optimisation algorithms. Developing software for low-powered sensing devices introduces several problems to be addressed, the most obvious being the limited computational resources available. In this paper we discuss an implementation of ADOPT, a pre-existing algorithm for distributed constraint optimisation, and describe how it has been integrated with a reflective agent platform developed for resource-constrained devices, namely Agent Factory Micro Edition (AFME). The usefulness of this work is illustrated through the canonical multi-agent coordination problem, namely graph colouring.
      16016 downloads · Scopus© Citations 2
  • Publication
    Curriculum Design in Higher Education: Theory to Practice
    (University College Dublin. Teaching and Learning, 2015-09)
    This eBook emphasises the theory-to-practice of curriculum design in higher education. The book focuses on the programme (not module) level of design; incorporates face-to-face, blended and online curricula; attempts to link theory to practice by giving some practical resources and/or exercises; draws on the author's experience of working on and researching curriculum design in the Irish higher education sector; is aimed at all staff involved in curriculum design, including academic staff (faculty), institutional managers, educational developers and technologists, support staff, library staff and curriculum researchers; and is primarily drawn from literature and experiences in the higher education sector, though those in adult and further education may also find it useful. The structure of this book is based on a curriculum design process that the author has developed through her experience of and research on curriculum design.
  • Publication
    Visualization in sporting contexts : the team scenario
    Wearable sensor systems require an interactive and communicative interface for the user to interpret data in a meaningful way. The development of adaptive personalisation features in a visualization tool for such systems can convey a more meaningful picture to the user of the system. In this paper, a visualization tool called Visualization in Team Scenarios (VTS), which can be used by a coach to monitor an athlete’s physiological parameters, is presented. The VTS has been implemented with a wearable sensor system that can monitor players’ performance in a game in a seamless and transparent manner. Using the VTS, a coach is able to analyse the physiological data of athletes generated using select wearable sensors, and subsequently analyse the results to personalise training schedules, thus improving the performance of the players.
  • Publication
    Provision of childcare services in Ireland
    (University College Dublin. School of Social Justice, 2008-03)
    External report commissioned by and presented to the EU Directorate-General Employment and Social Affairs, Unit G1 'Equality between women and men'
  • Publication
    Michael White's narrative therapy
    (Springer Verlag, 1998)
    A systematized description of a number of practices central to Michael White's narrative approach to therapy is given. These include collaborative positioning of the therapist, externalizing the problem, excavating unique outcomes, thickening the new plot, and linking the new plot to the past and the future. The practices of remembering and incorporation, using literary means to achieve therapeutic ends, and facilitating taking-it-back practices are also described. A number of questions are given which may be useful for those concerned with narrative therapy to address.
      15152 downloads · Scopus© Citations 103
  • Publication
    Financial statement fraud : some lessons from US and European case studies
    (Wiley-Blackwell, 2007-07)
    This paper studies 14 companies which were subject to an official investigation arising from the publication of fraudulent financial statements. The research found senior management to be responsible for most fraud. Recording false sales was the most common method of financial statement fraud. Meeting external forecasts emerged as the primary motivation. Management discovered most fraud, although the discovery was split between incumbent and new management.
      14896 downloads · Scopus© Citations 37
  • Publication
    The effectiveness of family therapy and systemic interventions for child-focused problems
    (Wiley, 2009-02)
    This review updates a similar paper published in the Journal of Family Therapy in 2001. It presents evidence from meta-analyses, systematic literature reviews and controlled trials for the effectiveness of systemic interventions for families of children and adolescents with various difficulties. In this context, systemic interventions include both family therapy and other family-based approaches such as parent training. The evidence supports the effectiveness of systemic interventions either alone or as part of multimodal programmes for sleep, feeding and attachment problems in infancy; child abuse and neglect; conduct problems (including childhood behavioural difficulties, ADHD, delinquency and drug abuse); emotional problems (including anxiety, depression, grief, bipolar disorder and suicidality); eating disorders (including anorexia, bulimia and obesity); and somatic problems (including enuresis, encopresis, recurrent abdominal pain, and poorly controlled asthma and diabetes).
      14277 downloads · Scopus© Citations 152
  • Publication
    Focus groups versus individual interviews with children : A comparison of data
    (Routledge (Taylor & Francis), 2006)
    In recent years there has been an increase in the use of qualitative data collection techniques in research with children. Among the most common of these methods are focus groups and individual interviews. While many authors claim that focus groups have advantages over individual interviews, these claims have not been tested empirically with children. The present study reports on the use of focus groups and interviews to collect qualitative data from 116 children in three age groups, with mean ages of 8.4, 11.5 and 14.3 years. The children were randomly allocated to participate in either focus groups or individual interviews where they were presented with identical material and questions relating to their beliefs about peers with psychological disorders. In line with previous research, the interviews produced significantly more relevant and unique ideas about the causes of these disorders than the focus groups, but the latter gave rise to greater elaboration of ideas. The participating children showed no significant difference in their preference for one method over the other. Thus, whether to choose individual interviews or focus groups is likely to depend on the nature of the research question in any given study.
      14049 downloads · Scopus© Citations 42
Recent Submissions
  • Publication
    Europäische Chancen und Schweizer Hoffnungen
    (Arbeitsgemeinschaft zur Förderung der politischen Bildung, 2023-11-15)
    Since 2012, EU citizens have been able to sign direct-democratic initiatives calling on the European Commission to change its policies. A European Citizens' Initiative (ECI) succeeds once its initiators gather one million signatures. While the Commission is not then obliged to hold an EU-wide referendum, it must nevertheless take the ECI seriously and examine appropriate measures. What concrete experiences have European trade unions had with ECIs to date?
  • Publication
    Towards the Leveraging of Data Deduplication to Break the Disk Acquisition Speed Limit
    Digital forensic evidence acquisition speed is traditionally limited by two main factors: the read speed of the storage device being investigated (i.e., the read speed of the disk, memory, remote storage, mobile device, etc.), and the write speed of the system used for storing the acquired data. Digital forensic investigators can somewhat mitigate the latter issue through the use of high-speed storage options, such as networked RAID storage, in the controlled environment of the forensic laboratory. However, traditionally, little can be done to improve the acquisition speed past its physical read speed from the target device itself. The protracted time taken for data acquisition wastes digital forensic experts' time, contributes to digital forensic investigation backlogs worldwide, and delays pertinent information from potentially influencing the direction of an investigation. In a remote acquisition scenario, a third contributing factor can also become a detriment to the overall acquisition time - typically the Internet upload speed of the acquisition system. This paper explores an alternative to the traditional evidence acquisition model through the leveraging of a forensic data deduplication system. The advantages that a deduplicated approach can provide over the current digital forensic evidence acquisition process are outlined and some preliminary results of a prototype implementation are discussed.
      8 downloads · Scopus© Citations 1
  • Publication
    Behavioral Service Graphs: A Big Data Approach for Prompt Investigation of Internet-Wide Infections
    The task of generating network-based evidence to support network forensic investigation is becoming increasingly prominent. Such evidence is imperative: not only can it be used to diagnose and respond to various network-related issues (e.g., performance bottlenecks and routing problems) but, more importantly, it can be leveraged to infer and further investigate network security intrusions and infections. In this context, this paper proposes a proactive approach that aims at generating accurate and actionable network-based evidence related to groups of compromised network machines. The approach is envisioned to guide investigators to promptly pinpoint such malicious groups for possible immediate mitigation as well as empowering network and digital forensic specialists to further examine those machines using auxiliary collected data or extracted digital artifacts. On one hand, the promptness of the approach is successfully achieved by monitoring and correlating perceived probing activities, which are typically the very first signs of an infection or misdemeanors. On the other hand, the generated evidence is accurate as it is based on an anomaly inference that fuses big data behavioral analytics in conjunction with formal graph theoretical concepts. We evaluate the proposed approach as a global capability in a security operations center. The empirical evaluations, which employ 80 GB of real darknet traffic, demonstrate the accuracy, effectiveness and simplicity of the generated network-based evidence.
  • Publication
    On the Benefits of Information Retrieval and Information Extraction Techniques Applied to Digital Forensics
    (Springer, 2016-08-30)
    Many jurisdictions suffer from lengthy evidence processing backlogs in digital forensics investigations. This has negative consequences for the timely incorporation of digital evidence into criminal investigations, while also affecting the timelines required to bring a case to court. Modern technological advances, in particular the move towards cloud computing, have great potential in expediting the automated processing of digital evidence, thus reducing the manual workload for investigators. It also promises to provide a platform upon which more sophisticated automated techniques may be employed to improve the process further. This paper identifies some research strands from the areas of Information Retrieval and Information Extraction that have the potential to greatly help with the efficiency and effectiveness of digital forensics investigations.
      7 downloads · Scopus© Citations 3
  • Publication
    Forensic Analysis and Remote Evidence Recovery from Syncthing: An Open Source Decentralised File Synchronisation Utility
    Commercial and home Internet users are becoming increasingly concerned with data protection and privacy. Questions have been raised regarding the privacy afforded by popular cloud-based file synchronisation services such as Dropbox, OneDrive and Google Drive. A number of these services have recently been reported as sharing information with governmental security agencies without the need for warrants to be granted. As a result, many users are opting for decentralised (cloudless) file synchronisation alternatives to the aforementioned cloud solutions. This paper outlines the forensic analysis of one such decentralised service, Syncthing, and applies remote evidence recovery techniques to it.
      7 downloads · Scopus© Citations 5
  • Publication
    Digital Evidence Bag Selection for P2P Network Investigation
    (Springer, 2013-09-04)
    The collection and handling of court admissible evidence is a fundamental component of any digital forensic investigation. While the procedures for handling digital evidence take much of their influence from the established policies for the collection of physical evidence, due to the obvious differences in dealing with non-physical evidence, a number of extra policies and procedures are required. This paper compares and contrasts some of the existing digital evidence formats or “bags” and analyses them for their compatibility with evidence gathered from a network source. A new digital extended evidence bag is proposed to specifically deal with evidence gathered from P2P networks, incorporating the network byte stream and on-the-fly metadata generation to aid in expedited identification and analysis.
      6 downloads · Scopus© Citations 6
  • Publication
    Investigating Cybercrimes that Occur on Documented P2P Networks
    The popularity of Peer-to-Peer (P2P) Internet communication technologies being exploited to aid cybercrime is ever increasing. P2P systems can be used or exploited to aid in the execution of a large number of online criminal activities, e.g., copyright infringement, fraud, malware and virus distribution, and botnet creation and control. P2P technology is perhaps most famous for the unauthorised distribution of copyrighted materials since the late 1990s, with the popularity of file-sharing programs such as Napster. In 2004, P2P traffic accounted for 80% of all Internet traffic, and in 2005 BitTorrent traffic alone accounted for over 60% of the world’s P2P bandwidth usage. This paper outlines a methodology for investigating a documented P2P network, BitTorrent, using a sample investigation for reference throughout. The sample investigation outlined was conducted on the top 100 most popular BitTorrent swarms over the course of a one-week period.
  • Publication
    Improving the accuracy of automated facial age estimation to aid CSEM investigations
    The investigation of violent crimes against individuals, such as the investigation of child sexual exploitation material (CSEM), is one of the more commonly encountered criminal investigation types throughout the world. While hash lists of known CSEM content are commonly used to identify previously encountered material on suspects’ devices, previously unencountered material requires expert, manual analysis and categorisation. The discovery, analysis, and categorisation of these digital images and videos has the potential to be significantly expedited with the use of automated artificial intelligence (AI) based techniques. Intelligent, automated evidence processing and prioritisation has the potential to aid investigators in alleviating some of the digital evidence backlogs that have become commonplace worldwide. In order for AI-aided CSEM investigations to be beneficial, the fundamental question when analysing multimedia content becomes “how old is each subject encountered?”. Our work presents the evaluation of existing cloud-based and offline age estimation services, introduces our deep learning model, DS13K, which was created with a VGG-16 Deep Convolutional Neural Network (CNN) architecture, and develops an ensemble technique that improves the accuracy of underage facial age estimation. In addition to our model, a number of existing services including Amazon Rekognition, Microsoft Azure Cognitive Services, How-Old.net, and Deep Expectation (DEX) were used to create an ensemble learning technique. It was found that for the borderline adulthood age range (i.e., 16–17 years old), our DS13K model substantially outperformed existing services, achieving a performance accuracy of 68%. A comparative examination of the obtained results allowed us to identify performance trends and issues inherent to each service/tool and develop ensemble techniques to improve the accuracy of automated adulthood determination.
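    One simple way to combine per-service age estimates into a single ensemble prediction is a normalised weighted average. This is an illustrative sketch only; the service names, estimates, and weights below are invented, and the paper's actual ensemble technique may differ:

    ```python
    def ensemble_age(estimates, weights):
        """Combine per-service age estimates into one prediction using a
        normalised weighted average. All inputs here are hypothetical."""
        total_weight = sum(weights[s] for s in estimates)
        return sum(estimates[s] * weights[s] for s in estimates) / total_weight

    # Hypothetical per-service estimates for one face, and per-service weights
    # (e.g. derived from each service's validation accuracy):
    estimates = {"service_a": 16.0, "service_b": 19.0, "service_c": 17.0}
    weights = {"service_a": 2.0, "service_b": 1.0, "service_c": 1.0}
    combined_age = ensemble_age(estimates, weights)
    ```

    Weighting lets a model that is known to be stronger on the borderline age range pull the combined estimate toward its prediction.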
  • Publication
    Enabling non-expert analysis of large volumes of intercepted network traffic
    Telecommunications wiretaps are commonly used by law enforcement in criminal investigations. While phone-based wiretapping has seen considerable success, the same cannot be said for Internet taps. Large portions of intercepted Internet traffic are often encrypted, making it difficult to obtain useful information. The advent of the Internet of Things further complicates network wiretapping. In fact, the current level of complexity of intercepted network traffic is almost at the point where data cannot be analyzed without the active involvement of experts. Additionally, investigations typically focus on analyzing traffic in chronological order and predominantly examine the data content of the intercepted traffic. This approach is overly arduous when the amount of data to be analyzed is very large. This chapter describes a novel approach for analyzing large amounts of intercepted network traffic based on traffic metadata. The approach significantly reduces the analysis time and provides useful insights and information to non-technical investigators. The approach is evaluated using a large sample of network traffic data.
      7 downloads · Scopus© Citations 5
  • Publication
    Accuracy Enhancement of Electromagnetic Side-Channel Attacks on Computer Monitors
    Electromagnetic noise emitted from running computer displays modulates information about the picture frames being displayed on screen. Attacks that eavesdrop on computer displays have been demonstrated by utilising these emissions as a side-channel vector. The accuracy of reconstructing a screen image depends on the emission sampling rate and bandwidth of the attacker's signal acquisition hardware. The cost of radio frequency acquisition hardware increases with increased supported frequency range and bandwidth. A number of enthusiast-level, affordable software defined radio equipment solutions are currently available, facilitating a number of radio-focused attacks at a more reasonable price point. This work investigates three accuracy-influencing factors, other than the sample rate and bandwidth, namely noise removal, image blending, and image quality adjustments, that affect the accuracy of monitor image reconstruction through electromagnetic side-channel attacks.
      7 downloads · Scopus© Citations 6
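    Of the accuracy factors listed, noise removal by blending repeated captures is the simplest to sketch: averaging several noisy reconstructions of the same frame suppresses uncorrelated noise while reinforcing the static screen content. A minimal illustration (pixel values are invented; a real pipeline would operate on full image arrays):

    ```python
    def average_frames(frames):
        """Pixel-wise mean over repeated captures of the same frame.
        Uncorrelated noise tends toward its mean across captures, while
        the underlying image content is reinforced."""
        n = len(frames)
        return [sum(pixels) / n for pixels in zip(*frames)]

    # Two noisy captures of the same two-pixel frame (illustrative values):
    denoised = average_frames([[10, 20], [14, 16]])
    ```

    With more captures, the averaged frame converges toward the noise-free image, which is why capture count trades off against acquisition time.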
  • Publication
    Electromagnetic side-channel attacks: Potential for progressing hindered digital forensic analysis
    Digital forensics is a fast-growing field involving the discovery and analysis of digital evidence acquired from electronic devices to assist investigations for law enforcement. Traditional digital forensic investigative approaches are often hampered by the data contained on these devices being encrypted. Furthermore, the increasing use of IoT devices with limited standardisation makes it difficult to analyse them with traditional techniques. This paper argues that electromagnetic side-channel analysis has significant potential to progress investigations obstructed by data encryption. Several potential avenues towards this goal are discussed.
      7 downloads · Scopus© Citations 13
  • Publication
    Expediting MRSH-v2 Approximate Matching with Hierarchical Bloom Filter Trees
    Perhaps the most common task encountered by digital forensic investigators consists of searching through a seized device for pertinent data. Frequently, an investigator will be in possession of a collection of “known-illegal” files (e.g. a collection of child pornographic images) and will seek to find whether copies of these are stored on the seized drive. Traditional hash matching techniques can efficiently find files that precisely match. However, these will fail in the case of merged files, embedded files, partial files, or if a file has been changed in any way. In recent years, approximate matching algorithms have shown significant promise in the detection of files that have a high bytewise similarity. This paper focuses on MRSH-v2. A number of experiments were conducted using Hierarchical Bloom Filter Trees to dramatically reduce the quantity of pairwise comparisons that must be made between known-illegal files and files on the seized disk. The experiments demonstrate substantial speed gains over the original MRSH-v2, while maintaining effectiveness.
  • Publication
    Deduplicated Disk Image Evidence Acquisition and Forensically-Sound Reconstruction
    The ever-growing backlog of digital evidence waiting for analysis has become a significant issue for law enforcement agencies throughout the world. This is due to an increase in the number of cases requiring digital forensic analysis coupled with the increasing volume of data to process per case. This has created a demand for a paradigm shift in the way evidence is acquired, stored, and analyzed. The ultimate goal of the research presented in this paper is to revolutionize the current digital forensic process by leveraging a centralized, deduplicated acquisition and processing approach. Focusing on the first step in digital evidence processing, acquisition, a system is presented that enables deduplicated evidence acquisition with the capability of automated, forensically-sound complete disk image reconstruction. The more cases the proposed system acquires, the more duplicate artifacts will be encountered, and the more efficient the processing of each new case will become. This results in a time saving for digital investigators, and provides a platform to enable non-expert evidence processing, alongside the benefits of reduced storage and bandwidth requirements.
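The acquisition scheme described above can be illustrated with a toy block-level deduplication store: every block is hashed, only previously unseen blocks are stored centrally, and a per-image manifest of hashes permits byte-for-byte reconstruction of the complete disk image. This is a minimal sketch under assumed names (`DedupStore`, a 4-byte toy block size), not the authors' system.

```python
import hashlib

BLOCK = 4  # toy block size in bytes; a real system would use e.g. 4 KiB

class DedupStore:
    """Central store keeping each unique block exactly once, keyed by hash."""
    def __init__(self):
        self.blocks = {}

    def acquire(self, image: bytes):
        """Hash each block and store only blocks not seen in any prior case.
        Returns the manifest needed for later reconstruction."""
        manifest = []
        for i in range(0, len(image), BLOCK):
            block = image[i:i + BLOCK]
            digest = hashlib.sha256(block).hexdigest()
            if digest not in self.blocks:
                self.blocks[digest] = block   # previously unseen artifact
            manifest.append(digest)
        return manifest

    def reconstruct(self, manifest):
        """Rebuild the complete disk image byte-for-byte from its manifest."""
        return b"".join(self.blocks[d] for d in manifest)

store = DedupStore()
m1 = store.acquire(b"AAAABBBBCCCC")      # first case: three new blocks
m2 = store.acquire(b"AAAABBBBDDDD")      # second case: only one new block
```

The second acquisition transmits and stores a single new block, which is why processing becomes more efficient as more cases are acquired.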
  • Publication
    Evaluation of Digital Forensic Process Models with Respect to Digital Forensics as a Service
    (Academic Conferences And Publishing International Limited, 2017-06-12)
    Digital forensic science is very much still in its infancy, but is becoming increasingly invaluable to investigators. A popular area for research is seeking a standard methodology to make the digital forensic process accurate, robust, and efficient. The first digital forensic process model proposed contains four steps: Acquisition, Identification, Evaluation and Admission. Since then, numerous process models have been proposed to explain the steps of identifying, acquiring, analysing, storing, and reporting on the evidence obtained from various digital devices. In recent years, an increasing number of more sophisticated process models have been proposed. These models attempt to speed up the entire investigative process or solve various problems commonly encountered in forensic investigation. In the last decade, cloud computing has emerged as a disruptive technological concept, and most leading enterprises such as IBM, Amazon, Google, and Microsoft have set up their own cloud-based services. In the field of digital forensic investigation, moving to a cloud-based evidence processing model would be extremely beneficial, and preliminary attempts have been made at its implementation. Moving towards a Digital Forensics as a Service model would not only expedite the investigative process, but could also result in significant cost savings - freeing up digital forensic experts and law enforcement personnel to progress their caseload. This paper aims to evaluate the applicability of existing digital forensic process models and analyse how each of these might apply to a cloud-based evidence processing paradigm.
  • Publication
    Solid State Drive Forensics: Where Do We Stand?
    With Solid State Drives (SSDs) becoming more and more prevalent in personal computers, some have suggested that the playing field has changed when it comes to forensic analysis. Inside the SSD, data movement events occur without any user input. Recent research has suggested that SSDs can no longer be managed in the same manner when performing digital forensic examinations. In performing forensic analysis of SSDs, the events that take place in the background need to be understood and documented by the forensic investigator. These behind-the-scenes processes cannot be stopped with traditional disk write blockers and have now become an accepted consequence of performing forensic analysis. In this paper, we aim to provide some clear guidance as to what precisely is happening in the background of SSDs during their operation and investigation, and we also study forensic methods to extract artefacts from SSDs under different conditions in terms of volume of data, power state, etc. In addition, we evaluate our approach with several experiments across various use-case scenarios.
  • Publication
    Deep learning at the shallow end: Malware classification for non-domain experts
    Current malware detection and classification approaches generally rely on time-consuming and knowledge-intensive processes to extract patterns (signatures) and behaviors from malware, which are then used for identification. Moreover, these signatures are often limited to local, contiguous sequences within the data, whilst ignoring their context in relation to each other and throughout the malware file as a whole. We present a Deep Learning-based malware classification approach that requires no expert domain knowledge and is based on a purely data-driven approach to complex pattern and feature identification.
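As a stand-in for the deep model, the "no expert domain knowledge" idea can be illustrated with a purely data-driven byte-level feature: a normalised byte histogram fed to a nearest-centroid classifier. All names and the toy data are hypothetical, and the paper's actual approach uses deep learning rather than centroids; this only shows how raw bytes can drive classification without hand-crafted signatures.

```python
def byte_histogram(data: bytes):
    """256-dim normalised byte-frequency vector: a purely data-driven
    representation that needs no reverse-engineering expertise."""
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    total = max(len(data), 1)
    return [c / total for c in counts]

def centroid(vectors):
    """Component-wise mean of a list of histograms."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(256)]

def classify(sample, centroids):
    """Assign the label of the nearest class centroid (squared Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Toy training data: text-like "benign" files vs high-byte "malware" files
benign = [b"hello world", b"config text"]
malware = [b"\xff\xfe\xfd" * 5, b"\xf0\xf1" * 8]
centroids = {"benign": centroid([byte_histogram(x) for x in benign]),
             "malware": centroid([byte_histogram(x) for x in malware])}
```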
  • Publication
    Digital forensic investigation of two-way radio communication equipment and services
    Historically, radio equipment was used solely as a two-way analogue communication device. Today, radio communication equipment is increasingly used by numerous organisations and businesses. The functionality of these traditionally short-range devices has expanded to include private call, address book, call logs, text messages, lone worker, telemetry, data communication, and GPS. Many of these devices also integrate with smartphones to deliver Push-To-Talk services that make it possible to set up connections between users using a two-way radio and a smartphone. In fact, these devices can be used to connect users only using smartphones. To date, there is little research on the digital traces in modern radio communication equipment. Increasing the knowledge base about these radio communication devices and services can be valuable to law enforcement in a police investigation. In this paper, we investigate what kind of radio communication equipment and services law enforcement digital investigators can encounter at a crime scene or in an investigation. Subsequent to seizure of this radio communication equipment, we explore the traces that may be of forensic interest and how these traces can be acquired. Finally, we test our approach on sample radio communication equipment and services.
  • Publication
    Towards Quantifying the Distance between Opinions
    Increasingly, critical decisions in public policy, governance, and business strategy rely on a deeper understanding of the needs and opinions of constituent members (e.g. citizens, shareholders). While it has become easier to collect a large number of opinions on a topic, there is a need for automated tools to help navigate the space of opinions. In such contexts, understanding and quantifying the similarity between opinions is key. We find that measures based solely on text similarity or on overall sentiment often fail to effectively capture the distance between opinions. Thus, we propose a new distance measure for capturing the similarity between opinions that leverages a nuanced observation: similar opinions express similar sentiment polarity on specific relevant entities of interest. Specifically, in an unsupervised setting, our distance measure achieves significantly better Adjusted Rand Index scores (up to 56x) and Silhouette coefficients (up to 21x) compared to existing approaches. Similarly, in a supervised setting, our opinion distance measure achieves considerably better accuracy (up to a 20% increase) compared to extant approaches that rely on text similarity, stance similarity, and sentiment similarity.
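The entity-level observation can be sketched as a toy distance over opinions represented as {entity: polarity} maps: two opinions are close when they express similar polarity on the same entities of interest. This is an illustrative simplification with hypothetical names, not the paper's actual measure.

```python
def opinion_distance(op_a, op_b):
    """Distance between opinions given as {entity: polarity} maps with
    polarity in [-1, 1]. Entities mentioned by only one opinion count as
    disagreement relative to a neutral (0.0) stance."""
    entities = set(op_a) | set(op_b)
    if not entities:
        return 0.0
    gap = sum(abs(op_a.get(e, 0.0) - op_b.get(e, 0.0)) / 2 for e in entities)
    return gap / len(entities)            # normalised to [0, 1]

# Two opinions agreeing on both entities, and one sharply disagreeing
alice = {"tax": -0.8, "schools": 0.9}
bob = {"tax": -0.7, "schools": 0.8}
carol = {"tax": 0.9, "schools": -0.9}
```

Note that a pure text-similarity measure could rate "cut taxes, fund schools" and "raise taxes, defund schools" as close, whereas this entity-level view separates them.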
  • Publication
    Learning to Sparsify Travelling Salesman Problem Instances
    In order to deal with the high development time of exact and approximation algorithms for NP-hard combinatorial optimisation problems and the high running time of exact solvers, deep learning techniques have been used in recent years as an end-to-end approach to finding solutions. However, deep learning techniques raise issues of representation, generalisation, architectural complexity, and the interpretability of models for mathematical analysis. As a compromise, machine learning can be used to improve the run-time performance of exact algorithms in a matheuristics framework. In this paper, we use a pruning heuristic leveraging machine learning as a pre-processing step, followed by an exact Integer Programming approach. We apply this approach to sparsify instances of the classical travelling salesman problem. Our approach learns which edges in the underlying graph are unlikely to belong to an optimal solution and removes them, thus sparsifying the graph and significantly reducing the number of decision variables. We use carefully selected features derived from linear programming relaxation, cutting-plane exploration, minimum-weight spanning tree heuristics, and various other local and statistical analyses of the graph. Our learning approach requires very little training data and is amenable to mathematical analysis. We demonstrate that our approach can reliably prune a large fraction of the variables in TSP instances from TSPLIB/MATILDA (>85%) while preserving most of the optimal tour edges. Our approach can successfully prune problem instances even if they lie outside the training distribution, resulting in small optimality gaps between the pruned and original problems in most cases. Using our learning technique, we discover novel heuristics for sparsifying TSP instances that may be of independent interest for variants of the vehicle routing problem.
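The sparsification step can be illustrated with a simple hand-crafted rule standing in for the learned pruner: keep an edge only if it lies in the minimum spanning tree (whose edges often appear in good tours) or is among a node's k cheapest incident edges. This is a sketch with hypothetical names, not the paper's machine-learned approach.

```python
import math

def sparsify(points, k=2):
    """Keep an edge only if it is in the MST or among a node's k cheapest
    incident edges (a hand-crafted stand-in for the learned pruner)."""
    n = len(points)

    def w(i, j):
        return math.dist(points[i], points[j])

    # Prim's algorithm: MST of the complete graph, to protect its edges
    in_tree, mst = {0}, set()
    while len(in_tree) < n:
        i, j = min(((a, b) for a in in_tree for b in range(n) if b not in in_tree),
                   key=lambda e: w(*e))
        mst.add(frozenset((i, j)))
        in_tree.add(j)

    keep = set(mst)
    for i in range(n):
        nearest = sorted((j for j in range(n) if j != i), key=lambda j: w(i, j))[:k]
        keep.update(frozenset((i, j)) for j in nearest)
    return keep

pts = [(0, 0), (0, 1), (1, 0), (1, 1), (5, 5)]   # 4 clustered cities + 1 outlier
kept = sparsify(pts)
total_edges = len(pts) * (len(pts) - 1) // 2      # 10 edges in the complete graph
```

Every removed edge eliminates a binary decision variable from the downstream Integer Program, which is where the running-time saving comes from.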
  • Publication
    Optimal algorithms for ranked enumeration of answers to full conjunctive queries
    We study ranked enumeration of join-query results according to very general orders defined by selective dioids. Our main contribution is a framework for ranked enumeration over a class of dynamic programming problems that generalizes seemingly different problems that had been studied in isolation. To this end, we extend classic algorithms that find the k shortest paths in a weighted graph. For full conjunctive queries, including cyclic ones, our approach is optimal in terms of the time to return the top result and the delay between results. These optimality properties are derived for the widely used notion of data complexity, which treats query size as a constant. By performing a careful cost analysis, we are able to uncover a previously unknown tradeoff between two incomparable enumeration approaches: one has lower complexity when the number of returned results is small, the other when the number is very large. We theoretically and empirically demonstrate the superiority of our techniques over batch algorithms, which produce the full result and then sort it. Our technique is not only faster for returning the first few results, but on some inputs beats the batch algorithm even when all results are produced.
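The ranked-enumeration idea (returning results in order with small delay instead of sorting the full output) can be sketched for a toy two-relation join ranked by total weight, using a heap to explore candidates lazily. All names are hypothetical, and the paper's framework is far more general than this cross-product example.

```python
import heapq

def ranked_join(r, s):
    """Lazily yield joined pairs in increasing order of total weight,
    without materialising and sorting the full cross product.
    r and s are lists of (weight, value), each sorted by weight."""
    heap = [(r[0][0] + s[0][0], 0, 0)]    # frontier: (cost, i, j)
    seen = {(0, 0)}
    while heap:
        cost, i, j = heapq.heappop(heap)
        yield cost, r[i][1], s[j][1]
        for ni, nj in ((i + 1, j), (i, j + 1)):   # push the two successors
            if ni < len(r) and nj < len(s) and (ni, nj) not in seen:
                seen.add((ni, nj))
                heapq.heappush(heap, (r[ni][0] + s[nj][0], ni, nj))

r = [(1, "a1"), (3, "a2")]
s = [(2, "b1"), (4, "b2")]
ranked = list(ranked_join(r, s))          # small delay per result, no batch sort
```

The top result is available after a single heap pop; a batch algorithm would first build all |r|·|s| results and then sort them.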