  • Publication
    Formalization of SLAs for Cloud Forensic Readiness
    Cloud Computing is one of the most pervasive ICT changes of the last few years. Clouds typically offer a variety of Services that are accessible over the Internet. These Services are regulated by contracts called Service Level Agreements (SLAs) between Service providers and customers. SLAs were already introduced in Service Oriented Architectures for situations where computing services need to be structured and regulated. In an SLA, the constraints of use, the duties and responsibilities of the parties involved, the charges, the service levels to guarantee, etc., are clearly stated by dedicated clauses. Despite the efforts made in systems security and the standardisation of SLAs, Cloud Services continue to suffer from various cybercriminal attacks, and unfortunately this phenomenon is likely to escalate within the next few years. It is becoming urgent to take countermeasures against these illegal practices in order to increase both customer trust and the quality of service of such new technologies. One such countermeasure is to provide an efficient cloud Forensic Readiness System (FRS) that prevents suspected attacks or strange behaviour and alerts the provider and/or customer about them. Much attention has been given to FRSs, and they have certainly moved from simple log files and monitoring to very sophisticated components involving both human experts and computer analysis tools. In this paper we study the effect of SLAs on FRSs. As SLAs may differ from one jurisdiction to another, we believe that FRSs should also comply with jurisdiction for greater efficiency and faster isolation and resolution of forensic cases. Therefore, we propose an FRS that automatically takes SLAs into account and issues warnings and alerts to its users (providers and consumers) based on the jurisdiction and the nature of the security breaches and attacks. These SLAs are presented to the system as a set of rules (clauses). This will also prevent illegal data transfers and communications among different jurisdictions. Part of this paper is dedicated to the formalisation of these SLAs and a study of their consequences on the FRS architecture and functioning. The rest of the paper is dedicated to the design and development of the FRS reference architecture integrating the proposed SLA formal model.
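The idea of presenting SLAs to the system as a set of rules (clauses) could be sketched as follows. This is a minimal illustration, not the paper's formal model: the clause fields, jurisdiction names and event shape are all assumptions made for the example.

```python
# Hypothetical sketch: SLA clauses as rules a forensic readiness system
# evaluates automatically against observed events. All names are illustrative.
from dataclasses import dataclass

@dataclass
class SlaClause:
    clause_id: str
    jurisdiction: str      # jurisdiction the clause applies to
    condition: callable    # predicate over an observed event
    alert: str             # warning issued when the condition matches

def evaluate(clauses, event):
    """Return the alerts of every clause whose condition matches the event."""
    return [c.alert for c in clauses
            if c.jurisdiction == event["jurisdiction"] and c.condition(event)]

clauses = [
    SlaClause("C1", "EU",
              lambda e: e["type"] == "data_transfer" and e["dest"] != "EU",
              "cross-jurisdiction data transfer"),
    SlaClause("C2", "EU",
              lambda e: e["failed_logins"] > 3,
              "possible brute-force attack"),
]

alerts = evaluate(clauses, {"jurisdiction": "EU", "type": "data_transfer",
                            "dest": "US", "failed_logins": 0})
```

Filtering clauses by jurisdiction first mirrors the paper's point that warnings should depend on where the parties operate.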
  • Publication
    Overview of the Forensic Investigation of Cloud Services
    Cloud Computing is a commonly used yet ambiguous term that can refer to a multitude of differing, dynamically allocated services. From a law enforcement and forensic investigation perspective, cloud computing can be thought of as a double-edged sword: on the one hand, gathering digital evidence from cloud sources can bring with it complicated technical and cross-jurisdictional legal challenges; on the other, employing cloud storage and processing capabilities can expedite the forensic process and focus the investigation onto pertinent data earlier. This paper examines the state of the art in cloud-focused digital forensic practices for the collection and analysis of evidence, and gives an overview of the potential use of cloud technologies to provide Digital Forensics as a Service.
  • Publication
    Toward a new approach for massive LiDAR data processing
    Laser scanning (also known as Light Detection And Ranging, LiDAR) has been widely applied in various applications. As part of that, aerial laser scanning (ALS) is used to collect topographic data points for large areas, which leads to millions of points being acquired. Furthermore, with full waveform (FWF) technology integrated during ALS data acquisition, all return information of each laser pulse is stored. ALS data have thus become massive and complex, since the FWF of each laser pulse can be stored as up to 256 samples and the density of ALS data is also increasing significantly. Processing LiDAR data demands heavy operations, and traditional approaches require significant hardware and running time. Researchers have recently proposed parallel approaches for analysing LiDAR data, normally based on the parallel architecture of the target systems, such as multi-core processors, GPUs, etc. However, efficient approaches and tools supporting the analysis of LiDAR data are still missing, due to the lack of a deep study of both the library tools and the algorithms used in processing this data. In this paper, we present a comparative study of software libraries and new algorithms to optimise the processing of LiDAR data. We also propose a new method to improve this process, with experiments on large LiDAR data. Finally, we discuss a parallel version of our approach in which we integrate parallel computing into the processing of LiDAR data.
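The parallelisation idea mentioned above can be illustrated with a small sketch. This is not the paper's method: the grid tiling, cell size and per-tile statistic are assumptions chosen to show how a point cloud can be split into independently processable units.

```python
# Illustrative sketch: tile an (x, y, z) point cloud into a spatial grid so
# each tile can be handed to a separate worker. Cell size is arbitrary here.
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

def tile_points(points, cell=10.0):
    """Group (x, y, z) points into grid cells keyed by (col, row)."""
    tiles = defaultdict(list)
    for x, y, z in points:
        tiles[(int(x // cell), int(y // cell))].append((x, y, z))
    return tiles

def mean_height(tile):
    """Example per-tile statistic: mean elevation of the returns in the tile."""
    return sum(z for _, _, z in tile) / len(tile)

points = [(1.0, 2.0, 5.0), (3.0, 4.0, 7.0), (15.0, 2.0, 9.0)]
tiles = tile_points(points)
with ThreadPoolExecutor() as pool:
    heights = dict(zip(tiles, pool.map(mean_height, tiles.values())))
```

In a real ALS pipeline the workers would be processes or GPU kernels and the per-tile operation far heavier (filtering, classification, FWF decomposition), but the decomposition pattern is the same.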
  • Publication
    CupCarbon: A Multi-Agent and Discrete Event Wireless Sensor Network Design and Simulation Tool
    (Institute for Computer Science, Social Informatics and Telecommunications Engineering (ICST), 2014-03-19)
    This paper presents the first version of a Wireless Sensor Network simulator called CupCarbon. It is a multi-agent and discrete event Wireless Sensor Network (WSN) simulator. Networks can be designed and prototyped in an ergonomic, user-friendly interface based on the OpenStreetMap (OSM) framework, by deploying sensors directly on the map. It can be used to study the behaviour of a network and its costs. The main objectives of CupCarbon are both educational and scientific: it can help trainers to explain the basic concepts of how sensor networks work, and it can help scientists to test their wireless topologies, protocols, etc. The current version can be used only to study the power diagram of each sensor and of the overall network. The power diagrams can be calculated and displayed as a function of the simulated time. Compared with existing simulators, prototyping networks in CupCarbon is more realistic.
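A power diagram as a function of simulated time, as described above, can be sketched with a tiny discrete-event loop. This is not CupCarbon's model: the event kinds, energy costs and battery capacity are invented for illustration.

```python
# Hedged sketch of a power diagram: a discrete-event loop drains a sensor's
# battery per send/receive event and records remaining energy over simulated
# time. All costs are made-up illustration values.
SEND_COST, RECV_COST = 2.0, 1.0   # energy units per event (assumed)

def simulate(events, battery=100.0):
    """events: (time, kind) tuples; returns [(time, remaining_energy)]."""
    trace, energy = [], battery
    for t, kind in sorted(events):
        energy -= SEND_COST if kind == "send" else RECV_COST
        trace.append((t, energy))
    return trace

trace = simulate([(1, "send"), (3, "recv"), (2, "send")])
```

Plotting `trace` per sensor gives the kind of per-node power diagram the simulator displays; summing traces gives the overall network view.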
  • Publication
    Mining Spatio-temporal Data at Different Levels of Detail
    In this paper we propose a methodology for mining very large spatio-temporal datasets. We propose a two-pass strategy for mining and manipulating spatio-temporal datasets at different levels of detail (i.e., granularities). The approach takes advantage of the multi-granular capability of the underlying spatio-temporal model to reduce the amount of data that needs to be accessed initially. The approach is implemented and applied to real-world spatio-temporal datasets. The experimental results demonstrate that the technique deals easily with very large datasets without losing accuracy in the extracted patterns.
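The two-pass, multi-granularity strategy can be sketched as follows. The granularities (day vs. hour) and the density threshold are assumptions for the example, not the paper's actual model.

```python
# Minimal sketch of a two-pass strategy: pass 1 scans a coarse aggregation
# (day granularity) to discard sparse regions; pass 2 touches fine-grained
# (hour-level) records only in the regions that survived.
from collections import defaultdict

def coarse_pass(events, threshold):
    """Count events per coarse cell (day) and keep only dense cells."""
    counts = defaultdict(int)
    for day, hour, value in events:
        counts[day] += 1
    return {day for day, n in counts.items() if n >= threshold}

def fine_pass(events, kept_days):
    """Second pass: return fine-grained records inside the kept cells."""
    return [(day, hour, value) for day, hour, value in events if day in kept_days]

events = [(1, 9, "a"), (1, 10, "b"), (1, 11, "c"), (2, 9, "d")]
dense = coarse_pass(events, threshold=2)
candidates = fine_pass(events, dense)
```

The coarse pass is cheap because it reads only the aggregated granularity, which is where the reduction in initially accessed data comes from.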
  • Publication
    Classical Mechanics Optimization for image segmentation
    In this work, we focus on image segmentation by simulating the natural phenomenon of bodies moving through space. For this, a subset of image pixels is regularly selected as planets and the rest as satellites. The attraction force is defined by Newton's law of gravitation according to distance and color similarity. In the first phase of the algorithm, we seek an equilibrium state of the earth-moon system in order to reach the second phase, in which we search for an equilibrium state of the earth-apple system. As a result of these two phases, bodies in space are constructed; they represent segments of the image. The objective of this simulation is to find and then extract the multiple segments of an image.
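The attraction rule combining distance and color similarity could look like the following sketch. The force formula and constants are assumptions for illustration; the paper's actual definition may differ.

```python
# Illustrative sketch: a gravity-style force between a "planet" pixel and a
# "satellite" pixel that grows with color similarity and falls with squared
# spatial distance. Constants and the similarity term are assumed.
def attraction(p, q, g=1.0):
    """p, q: (x, y, intensity) pixels; returns the attraction force."""
    d2 = (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    color_sim = 1.0 / (1.0 + abs(p[2] - q[2]))   # 1.0 when colors match
    return g * color_sim / d2 if d2 else float("inf")

def assign(satellite, planets):
    """Attach a satellite pixel to the planet pulling on it most strongly."""
    return max(planets, key=lambda planet: attraction(planet, satellite))

planets = [(0, 0, 100), (10, 10, 20)]
seg = assign((1, 0, 100), planets)   # near and same color as the first planet
```

Iterating such assignments until no satellite changes planet is one way to read the "equilibrium state" the abstract describes.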
  • Publication
    An efficient customer search tool within an anti-money laundering application implemented on an international bank's dataset
    Today, money laundering (ML) poses a serious threat not only to financial institutions but also to nations. This criminal activity is becoming more and more sophisticated and seems to have moved from the cliché of drug trafficking to financing terrorism, and surely not forgetting personal gain. Most financial institutions internationally have been implementing anti-money laundering (AML) solutions to fight investment fraud activities. In AML, customer identification is an important task which helps AML experts to monitor customer habits, such as the customer's domicile and the transactions they are involved in. However, the simple query tools provided by current DBMSs, as well as naive approaches to customer searching, may produce incorrect and ambiguous results, and their processing time is also very high due to the complexity of the database system architecture. In this paper, we present a new approach for identifying customers registered in an investment bank. This approach is developed as a tool that allows AML experts to quickly identify customers who are managed independently across separate databases. It is tested on large, real-world financial datasets. Preliminary experimental results show that this new approach is efficient and effective.
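One common way to match the same customer across separate databases, which the abstract alludes to, is normalised fuzzy name matching. This sketch is a generic illustration, not the paper's tool: the similarity measure, the 0.8 cut-off and the sample records are all assumptions.

```python
# Hypothetical sketch: match customer records across two databases by
# normalising names and comparing them with an edit-based similarity ratio.
from difflib import SequenceMatcher

def normalise(name):
    """Lower-case and collapse whitespace so formatting noise is ignored."""
    return " ".join(name.lower().split())

def match_customers(db_a, db_b, cutoff=0.8):
    """Return (id_a, id_b) pairs whose names are similar enough."""
    pairs = []
    for id_a, name_a in db_a:
        for id_b, name_b in db_b:
            sim = SequenceMatcher(None, normalise(name_a),
                                  normalise(name_b)).ratio()
            if sim >= cutoff:
                pairs.append((id_a, id_b))
    return pairs

pairs = match_customers([(1, "John  Smith"), (2, "Ann Lee")],
                        [(7, "john smith"), (8, "Bob Roy")])
```

A production tool would replace the quadratic scan with blocking or indexing; the point here is only the normalise-then-compare pattern.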
  • Publication
    Reference Architecture for a Cloud Forensic Readiness System
    Digital forensic science is taking part in a brand-new change: the management of incidents in Cloud Computing services. Because specific features of the Cloud Computing architecture make it hard to control, its use to commit crimes is becoming a very critical issue, too. Proactive Cloud forensics becomes a matter of urgency, due to its capability of collecting critical data before crimes happen, thus saving time and money for the subsequent investigations. In this paper, a proposal for a Cloud Forensic Readiness System is presented. It is conceived as a reference architecture, in order to be of general applicability, not technically constrained by any Cloud architecture. The principal aim of this work is to extend our initially proposed Cloud Forensic Readiness System reference architecture by providing more details and an example of its application exploiting the OpenStack Cloud platform.
  • Publication
    Simulating SQL-Injection Cyber-attacks using GNS3
    (International Journal of Computer Theory and Engineering, 2015-02-13)
    Network Forensics is a subtopic of Digital Forensics in which research on artifact investigation and intrusion evidence acquisition is addressed. Among the many challenges in the field, the problem of losing data artifacts in a state of flux (i.e., live volatile data) when network devices suddenly become non-operational remains a topic of interest to many investigators. The main objective of this article is to simulate SQL injection attack scenarios in a complex network environment. We designed and simulated a typical Demilitarized Zone (DMZ) network environment using the Graphical Network Simulator (GNS3), VirtualBox and VMware Workstation. Using this set-up, we are now able to simulate specific network device configurations, perform SQL injection attacks against victim machines and collect network logs. The main motivation of our work is to ultimately define an attack pathway prediction methodology that makes it possible to examine the network artifacts collected in the case of network attacks.
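The attack class being simulated can be shown in miniature. This sketch is not from the paper's test bed: the table, credentials and payload are made up, and an in-memory SQLite database stands in for the victim machine.

```python
# Hedged sketch of a classic SQL injection: a login query built by string
# concatenation is bypassed by an always-true payload, while the
# parameterised version treats the payload as plain data.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, pw TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 'secret')")

def login_vulnerable(name, pw):
    # Unsafe: attacker-controlled input is spliced into the SQL text.
    q = f"SELECT COUNT(*) FROM users WHERE name='{name}' AND pw='{pw}'"
    return db.execute(q).fetchone()[0] > 0

def login_safe(name, pw):
    # Safe: placeholders keep the input out of the SQL grammar.
    q = "SELECT COUNT(*) FROM users WHERE name=? AND pw=?"
    return db.execute(q, (name, pw)).fetchone()[0] > 0

payload = "' OR '1'='1"
breached = login_vulnerable("alice", payload)   # injection succeeds
blocked = not login_safe("alice", payload)      # parameterised query holds
```

In the DMZ simulation, it is queries like the vulnerable one above that generate the network logs the investigation then collects.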
  • Publication
    E-government Alerts Correlation Model
    Qatar's IT infrastructure is rapidly growing to encompass the evolution of businesses and the economic growth the country is increasingly witnessing throughout its industries. It is now evident that the country's e-government requirements and associated data management systems are becoming large in number, highly dynamic in nature, and exceptionally attractive for cybercrime activities. Protecting the sensitive data that e-government portals rely on for daily activities is not a trivial task. The techniques used to perform cybercrimes are growing in sophistication along with the firewalls protecting against them. Reaching a high level of data protection, in both wired and wireless networks, in order to face recent cybercrime approaches is a challenge that has continuously proven hard to achieve. In a common IT infrastructure, each deployed network device holds a number of event logs that reside locally within its memory. These logs are large in number, and analyzing them is therefore a time-consuming task for network administrators. In addition, a single network event often generates many redundant, similar event logs belonging to the same class within short time intervals. The large amount of redundant logs makes them difficult to manage during a forensic investigation. In most cybercrime cases, a single alert log does not contain sufficient information about a malicious action's background and invisible network attackers; the information for a particular malicious action or attacker is often distributed among multiple alert logs and among multiple network devices. The forensic investigator's mission to detect malicious activities and reconstruct incident scenarios is now very complex, considering the number as well as the quality of these event logs.
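The redundancy problem described above, where one network event spawns many similar alert logs of the same class in a short interval, can be illustrated with a simple correlation sketch. The window size and alert classes are assumptions, not the paper's model.

```python
# Illustrative sketch: collapse alerts of the same class that arrive within a
# short window of the group's first alert into one correlated alert + count.
def correlate(alerts, window=60):
    """alerts: time-sorted (timestamp, alert_class) tuples.
    Returns (alert_class, first_timestamp, count) groups."""
    groups = []
    for ts, cls in alerts:
        if groups and groups[-1][0] == cls and ts - groups[-1][1] <= window:
            groups[-1] = (cls, groups[-1][1], groups[-1][2] + 1)
        else:
            groups.append((cls, ts, 1))
    return groups

alerts = [(0, "port_scan"), (10, "port_scan"), (20, "port_scan"),
          (200, "port_scan"), (210, "sql_injection")]
correlated = correlate(alerts)
```

Collapsing three near-simultaneous port-scan alerts into one entry is the kind of reduction that makes multi-device log analysis tractable for an investigator; a real correlation model would also link alerts of different classes across devices.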