Department of Computer Science: Recent submissions

  • Eid, Fadi (Notre Dame University-Louaize, 2005)
    This thesis tackles performability issues in wireless sensor networks. Performability is a mission-specific measure of system effectiveness that seeks to combine the traditional reliability and performance measures of a system. Wireless sensor networks consist of a huge number of small sensor nodes that communicate wirelessly. These sensor nodes can be deployed in areas that are hard to access, which opens up new fields of application [13]. This thesis aims to provide a wireless communication architecture for petroleum installations, such as offshore platforms and onshore processing units, ...
  • Bechara, Reine (Notre Dame University-Louaize, 2005)
    Enterprise resource planning (ERP) is among the latest technologies that companies have adopted. Typically it is a software package with a centralized database serving several modules; these modules can be customized to the organization's needs. Since the cost of an ERP implementation is considered to be very high, it is critical for organizations to make it a success and to start seeing a return on their investment. But what makes an ERP implementation project successful? How can its success be evaluated? These are the questions addressed by the thesis. To answer these questions, a research ...
  • Ghanem, Pascale Y. (Notre Dame University-Louaize, 2001)
    The global Internet has experienced many years of sustained exponential growth, doubling in size every nine months or faster [8]. Millions of users at tens of thousands of sites around the world depend on the global Internet as part of their daily work environment. This massive use of the Internet, together with the continuous interconnection of new groups, raises many problems, such as packet loss, network congestion, insufficient bandwidth, and increased delay. In this thesis, we focus mainly on the problem of communication delay and bandwidth allocation. Our main goal is to find a way to minimize ...
  • Boutros, Rania A. (Notre Dame University-Louaize, 1999)
    A well-known problem with wormhole-routed packet networks is the potentially large amount of blocking that packets can experience due to link contention. Because of the very limited amount of buffering in such networks, blocked packets remain in the network and keep using network resources. Thus, blocked packets may in turn cause other packets to be blocked. This may affect a large number of packets over a large portion of the network. Proper connection management strategies and appropriate protocols must be devised to ensure that blocking of packets due to link contention is bounded. In [3], ...
  • Balian, Armen A. (Notre Dame University-Louaize, 1999)
    In this thesis, automatic segmentation of liver veins from ultrasound slices is implemented, and the result is further rendered into a 3D image to help the physician apply any intended study of the liver more accurately. The main part of this study is texture analysis, since it is the hardest part of the problem. As a start, four different texture analysis techniques are introduced: Spatial Gray-Level Dependence Matrices (SGLDM), the Fourier Power Spectrum (FPS), Gray-Level Difference statistics (GLD), and Laws' Texture Energy Measures (TEM). Each of these methods has been discussed ...
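    The thesis's own implementation is not shown here; purely to make the flavor of these statistics concrete, the following minimal sketch (Python with NumPy, using an arbitrary displacement and two illustrative features) computes Gray-Level Difference statistics on toy images.

```python
import numpy as np

def gld_features(image, dx=1, dy=0, levels=256):
    """Gray-Level Difference (GLD) statistics for one displacement.

    Builds the histogram of absolute gray-level differences between each
    pixel and its neighbour at offset (dy, dx), then derives two common
    texture features: contrast and angular second moment.
    """
    img = np.asarray(image, dtype=np.int32)
    # Pair each pixel with its displaced neighbour (crop to the overlap).
    a = img[: img.shape[0] - dy, : img.shape[1] - dx]
    b = img[dy:, dx:]
    diff = np.abs(a - b)
    hist = np.bincount(diff.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()                 # probability of each difference value
    d = np.arange(levels)
    contrast = float(np.sum(d * d * p))   # large for coarse, high-contrast texture
    asm = float(np.sum(p * p))            # large for smooth, homogeneous texture
    return contrast, asm

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    noisy = rng.integers(0, 256, size=(64, 64))    # rough, random texture
    smooth = np.tile(np.arange(64), (64, 1))       # gentle gradient
    print("noisy :", gld_features(noisy))
    print("smooth:", gld_features(smooth))
```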
  • Karaa, George Emile (Notre Dame University-Louaize, 2001)
    This thesis addresses the problem of analyzing the Web hyperlink structure in order to locate authoritative web pages relevant to a given query subject. In particular, we discuss the HITS algorithm, a pioneering method that exploits the web link structure to locate important information sources, as well as other algorithms based on HITS, such as ARC and PageRank. We experiment with different aspects of HITS and introduce new heuristics for assigning link weights based on textual contexts.
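    As a rough illustration of the basic HITS iteration referred to above (without the thesis's link-weighting heuristics), a minimal sketch on a toy link graph might look like this:

```python
from collections import defaultdict

def hits(links, iterations=50):
    """Basic HITS: iteratively refine hub and authority scores.

    `links` maps a page to the list of pages it links to. A good hub points
    to good authorities; a good authority is pointed to by good hubs.
    """
    pages = set(links) | {q for targets in links.values() for q in targets}
    auth = {p: 1.0 for p in pages}
    hub = {p: 1.0 for p in pages}
    incoming = defaultdict(list)
    for p, targets in links.items():
        for q in targets:
            incoming[q].append(p)

    for _ in range(iterations):
        # Authority update: sum of hub scores of pages linking in, then normalize.
        auth = {p: sum(hub[q] for q in incoming[p]) for p in pages}
        norm = sum(v * v for v in auth.values()) ** 0.5 or 1.0
        auth = {p: v / norm for p, v in auth.items()}
        # Hub update: sum of authority scores of pages linked to, then normalize.
        hub = {p: sum(auth[q] for q in links.get(p, [])) for p in pages}
        norm = sum(v * v for v in hub.values()) ** 0.5 or 1.0
        hub = {p: v / norm for p, v in hub.items()}
    return auth, hub

if __name__ == "__main__":
    toy_web = {"a": ["c", "d"], "b": ["c"], "c": ["d"], "d": []}
    authorities, hubs = hits(toy_web)
    print(sorted(authorities.items(), key=lambda kv: -kv[1]))
```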
  • Akhras, Chukri (Notre Dame University-Louaize, 2003)
    Administration of access control was, and still is, a crucial, critical, and complex aspect of security administration. Many models have been developed and used to carry out this administration, such as Mandatory Access Control (MAC), Discretionary Access Control (DAC), and Role-Based Access Control (RBAC). The latter, RBAC, is a flexible and policy-independent access control model that reflects the natural structure of an organization, where functions are grouped into roles and users are assigned to one or more of these roles. In large organizations with relatively large systems, with hundreds of roles and users and ...
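    To make the role-based structure described above concrete, here is a toy sketch of the core RBAC relations (user-role assignment, role-permission assignment, and an access check); the administrative models the thesis studies for large systems are not reproduced, and the role and permission names are illustrative.

```python
class RBAC:
    """Minimal role-based access control: users -> roles -> permissions."""

    def __init__(self):
        self.user_roles = {}   # user -> set of roles
        self.role_perms = {}   # role -> set of permissions

    def assign_role(self, user, role):
        self.user_roles.setdefault(user, set()).add(role)

    def grant(self, role, permission):
        self.role_perms.setdefault(role, set()).add(permission)

    def check(self, user, permission):
        # A user holds a permission if any of their roles grants it.
        return any(permission in self.role_perms.get(role, set())
                   for role in self.user_roles.get(user, set()))

if __name__ == "__main__":
    ac = RBAC()
    ac.grant("teller", "account:read")
    ac.grant("manager", "account:approve")
    ac.assign_role("alice", "teller")
    print(ac.check("alice", "account:read"))     # True
    print(ac.check("alice", "account:approve"))  # False
```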
  • Hatoum, Oussama (Notre Dame University-Louaize, 2004)
    The aim of this thesis is to define a new meaning for microprogramming as we know it. Microprogramming has been viewed from two different perspectives. In commercial applications, microprogramming was treated primarily as an emulation tool. This allowed systems such as the IBM System/360 to implement both backward and forward compatibility. In military applications, microprogramming was considered a tool for building optimized and fault-tolerant control units. Dynamic and user microprogramming are usually used to refer to the same concept. In this thesis we will make a clear distinction between the two terms. User ...
  • Bouorm, Ramez (Notre Dame University-Louaize, 2005)
    The classic way for applications to communicate with a database is through ODBC (for standard applications) and DLL files (for distributed applications). The job of these APIs is to connect to the database to send and retrieve information. We introduce the WDBC pattern, which is designed to perform the same tasks but, in addition, guarantees the safety of the data (by backing up the database over the network) and can be integrated into applications created with different programming languages and running on different operating systems.
  • Farah, Fahed E. (Notre Dame University-Louaize, 2005)
    The purpose of this thesis is to discuss the methods used to evaluate open source software for content management systems, and then to take a real case of open source content management software, illustrate the problem, evaluate the software, apply the evaluation methods, and study its features. The software in this case is Zope, an open source application server used to design and build content management systems, intranets, portals, and custom applications. This software is written in Python, one of the most efficient object-oriented ...
  • Zeaiter, Leonardo (Notre Dame University-Louaize, 2005)
    In this thesis, we demonstrate how developers can use Model Driven Architecture (MDA) techniques to make the software development process more reliable, robust, and less costly. We focus on a set of transformation techniques that, together, transform a PSM (Platform Specific Model) into a code model written in Visual Basic.NET and an ER (Entity Relationship) model expressed in SQL (Structured Query Language) and targeting the Microsoft SQL Server relational database management system. The source model is assumed to be represented in UML (Unified Modeling Language). ...
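    The thesis's actual transformation set is not reproduced here; the following toy sketch only illustrates the general idea of a model-to-text transformation by generating SQL Server DDL from a small, hypothetical in-memory model (the entity names and the type mapping are illustrative assumptions, not the thesis's).

```python
# Hypothetical, minimal model: entity name -> {attribute: logical type}.
MODEL = {
    "Customer": {"Id": "int", "Name": "string", "Joined": "date"},
    "Order":    {"Id": "int", "CustomerId": "int", "Total": "decimal"},
}

# Assumed mapping from logical model types to SQL Server column types.
TYPE_MAP = {"int": "INT", "string": "NVARCHAR(255)",
            "date": "DATETIME", "decimal": "DECIMAL(18, 2)"}

def to_ddl(model):
    """Model-to-text transformation: emit one CREATE TABLE per entity."""
    statements = []
    for entity, attrs in model.items():
        cols = [f"    [{name}] {TYPE_MAP[t]}" for name, t in attrs.items()]
        cols[0] += " PRIMARY KEY"   # naive convention: first attribute is the key
        statements.append(f"CREATE TABLE [{entity}] (\n" + ",\n".join(cols) + "\n);")
    return "\n\n".join(statements)

if __name__ == "__main__":
    print(to_ddl(MODEL))
```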
  • Hashem, Jinane N. (Notre Dame University-Louaize, 2005)
    Metadata is considered a very useful and valuable component in handling data and delivering decisional information, in managing content on the Internet, in databases and warehouses, in enabling the discovery of relevant information, and in effectively providing a multiplicity of useful services. Data, in point of fact, can be metaphorically compared to a dead substance; all acknowledge that data, by merely existing in systems, represents almost nothing. It is the awareness of the exact extent and value of this data that builds the valuable information. This thesis ...
  • Aad, Elie (Notre Dame University-Louaize, 2002)
    Mining association rules has been an important topic in data mining research in recent years from the standpoint of supporting human-centered knowledge discovery. The present-day model of mining association rules suffers from the following shortcomings: (i) lack of user exploration and control, (ii) lack of focus, and (iii) a huge number of rules that are practically unreadable. Data mining researchers have placed great importance on developing fast algorithms for rule discovery, and have applied different types of constraints in different algorithms to prune itemsets that do not occur frequently ...
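    As a minimal illustration of the support- and confidence-based pruning mentioned above (not the constrained algorithms studied in the thesis), a toy two-item association-rule pass might look like this:

```python
from itertools import combinations

def frequent_pairs(transactions, min_support):
    """Toy Apriori-style step: keep only item pairs meeting minimum support."""
    n = len(transactions)
    counts = {}
    for t in transactions:
        for pair in combinations(sorted(set(t)), 2):
            counts[pair] = counts.get(pair, 0) + 1
    return {pair: c / n for pair, c in counts.items() if c / n >= min_support}

def rules_from_pairs(pairs, transactions, min_confidence):
    """Derive rules A -> B from frequent pairs, pruned by confidence."""
    n = len(transactions)
    item_support = {}
    for t in transactions:
        for item in set(t):
            item_support[item] = item_support.get(item, 0) + 1 / n
    rules = []
    for (a, b), support in pairs.items():
        for lhs, rhs in ((a, b), (b, a)):
            confidence = support / item_support[lhs]
            if confidence >= min_confidence:
                rules.append((lhs, rhs, support, confidence))
    return rules

if __name__ == "__main__":
    baskets = [["bread", "milk"], ["bread", "milk", "eggs"],
               ["milk", "eggs"], ["bread", "eggs"]]
    pairs = frequent_pairs(baskets, min_support=0.5)
    for lhs, rhs, s, c in rules_from_pairs(pairs, baskets, min_confidence=0.6):
        print(f"{lhs} -> {rhs}  support={s:.2f} confidence={c:.2f}")
```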
  • Abboud, Joelle P. (Notre Dame University-Louaize, 2005)
    Internet banking is a relatively new technology used by banks to interact with their customers, providing them with a wide range of services available online. These services can be classified into five categories: informational, administrative, transactional, portal, and others. However, adopting Internet banking brings new challenges, risks, and threats that are either inherited from the Internet or specific to the open banking environment. Therefore, security requirements specific to the Internet banking environment should be adopted, aiming to control these risks and manage ...
  • Soueidy, Amine (Notre Dame University-Louaize, 2001)
    This thesis discusses the topic of learning in a complex environment. Both the historical basis of the field and a broad selection of current work are summarized. Reinforcement learning is the problem faced by an agent that learns behavior through trial-and-error interactions with a dynamic environment. The work described here resembles work in psychology, but differs considerably in the details and in the use of the word "reinforcement". We describe the foundations of a new field of reinforcement learning named the artificial economy. We experiment with a prototype based on the artificial ...
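    As a plain illustration of trial-and-error reinforcement learning (not the artificial-economy formulation the thesis introduces), a minimal tabular Q-learning sketch on a toy corridor environment follows:

```python
import random

def q_learning(episodes=500, alpha=0.5, gamma=0.9, epsilon=0.1, n_states=5):
    """Tabular Q-learning on a toy corridor: move left/right, reward at the end.

    The agent learns purely by trial and error: it acts, observes a reward
    and the next state, and nudges its value estimates toward the outcome.
    """
    q = [[0.0, 0.0] for _ in range(n_states)]   # q[state][action], 0=left, 1=right
    for _ in range(episodes):
        state = 0
        while state < n_states - 1:
            # Epsilon-greedy exploration.
            action = random.randrange(2) if random.random() < epsilon \
                else max((0, 1), key=lambda a: q[state][a])
            next_state = max(0, state - 1) if action == 0 else state + 1
            reward = 1.0 if next_state == n_states - 1 else 0.0
            # Temporal-difference update toward reward + discounted future value.
            target = reward + gamma * max(q[next_state])
            q[state][action] += alpha * (target - q[state][action])
            state = next_state
    return q

if __name__ == "__main__":
    for s, values in enumerate(q_learning()):
        print(f"state {s}: left={values[0]:.2f} right={values[1]:.2f}")
```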
  • El Daccache, Walid (Notre Dame University-Louaize, 2001)
    The World Wide Web is considered by far to be a simple and universal standard for exchanging information. This information, which started out manually composed and human-readable, needed to grow toward dynamic composition and machine readability. Static web documents needed to be released from the mixture of data and presentation and lifted to a higher level of data representation and organization. XML, the "eXtensible Markup Language", as a standard for data representation on the Web, offered flexibility and simplicity in representing self-describing electronic documents. This ...
  • Jurascovitch, Elie M. (Notre Dame University-Louaize, 2002)
    The purpose of this thesis is to study distributed applications in the context of heterogeneous clients (Web browsers, mobile devices) and business-to-business (B2B) integration. Two distributed architectures, leaders of the distributed applications market, are presented and compared: Microsoft's .NET and Sun's Java 2 Enterprise Edition (J2EE). After this comparative study, we chose .NET to develop an e-commerce application (the Jurasco Style electronic shop) in order to outline the major application architectural issues and the solutions offered by .NET technology. Our study also shows how the .NET Jurasco ...
  • Mouawad, Maurice T. (Notre Dame University-Louaize, 2001-06)
    The web is a vast source of information. However, due to the diversity and dissimilarity of web pages' contents, this information is buried in the chaotic structure of the World Wide Web. At the same time, with the spread of web access, search engines have become, if not the sole utility, one of the mechanisms most used by the growing number of users to find interesting information. We are interested in identifying how pieces of information represented by URL pages that share common topics are related as they are represented on the web. One such problem is studying patterns of occurrences of ...
  • Dhaini, Bassel H. (Notre Dame University-Louaize, 2004)
    The aim of data mining as a scientific research field is to develop methods for analyzing large amounts of data in order to discover interesting regularities or exceptions. Typical problems that must be resolved when developing effective data mining algorithms arise from the large sizes of both the data sets used in the mining process and the resulting pattern sets (for example, rules) that form the discovered knowledge. Researchers aim to find the most advantageous (i.e., most effective) solutions during the data preparation stage, the exploration stage, and finally the post- ...
  • Mouawad, Pauline (Notre Dame University-Louaize, 2004)
    This thesis is motivated by the latest work on differential compression algorithms, as it appears in the work of Ajtai et al. (2002). In particular, we pay special attention to delta encoding algorithms that achieve good compression in linear time and constant space. This is important because previous work in this area uses either quadratic time and constant space or linear time and linear space, which is unacceptable for large inputs. In delta encoding, the algorithm reads two different copies of the same file as input, termed the reference copy and the version copy. The output of the ...
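    The linear-time, constant-space algorithm of Ajtai et al. is not reproduced here; the following naive greedy sketch only illustrates what a delta, a sequence of COPY and ADD commands over a reference copy and a version copy, looks like.

```python
def delta_encode(reference, version, min_match=4):
    """Naive greedy delta encoder (illustration only, not Ajtai et al.).

    Emits COPY(offset, length) commands for substrings found in the
    reference and ADD(text) commands for everything else.
    """
    delta, i, pending = [], 0, []
    while i < len(version):
        # Grow the longest reference substring starting at version[i].
        best_off, best_len = -1, 0
        length = min_match
        while True:
            off = reference.find(version[i:i + length])
            if off < 0 or i + length > len(version):
                break
            best_off, best_len = off, length
            length += 1
        if best_len >= min_match:
            if pending:
                delta.append(("ADD", "".join(pending)))
                pending = []
            delta.append(("COPY", best_off, best_len))
            i += best_len
        else:
            pending.append(version[i])
            i += 1
    if pending:
        delta.append(("ADD", "".join(pending)))
    return delta

def delta_decode(reference, delta):
    """Rebuild the version copy from the reference copy and the delta commands."""
    out = []
    for cmd in delta:
        if cmd[0] == "COPY":
            _, off, length = cmd
            out.append(reference[off:off + length])
        else:
            out.append(cmd[1])
    return "".join(out)

if __name__ == "__main__":
    ref = "the quick brown fox jumps over the lazy dog"
    ver = "the quick red fox jumps over the lazy cat"
    d = delta_encode(ref, ver)
    print(d)
    assert delta_decode(ref, d) == ver
```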
