Dr. Belkacem Chaouchi
Department of Mathematics, Faculty of Sciences, Khemis Miliana University, Algeria.


ISBN 978-93-91882-04-4 (Print)
ISBN 978-93-91882-19-8 (eBook)
DOI: 10.9734/bpi/ctmcs/v10


This book covers key areas of mathematics and computer science. The contributions by the authors include entropy coding, Huffman encoding, leveling, lossless data compression, data mining, land transport infrastructure, machine learning, sentiment analysis, fuzzy primes, primary submodules, radical, fuzzy submodules, orthogonal design, nearly orthogonal design, super saturated design, Hadamard matrix, D-optimality, J2-optimality, fuzzy control, multistage fuzzy control, deterministic object, stochastic object, fuzzy object, stochastic-fuzzy knowledge base, formal concept analysis, cryptography, integer factorization, binary tree, algorithm, virtual reality, augmented reality, Oculus Rift, Leap Motion, model driven engineering, object constraint language, mobile software, agility, agile information system, IT agility, IS agility, agile enterprise, agile organization, agile BSC, balanced scorecard, elliptic boundary value problem, finite element scheme with discontinuity, aseismic ground deformation, strike-slip fault, and linearly viscoelastic medium. This book contains various materials suitable for students, researchers and academicians in the field of mathematics and computer science.




An Efficient VLC for Lossless Data Compression: Leveling

Javier Joglar Alcubilla

Current Topics on Mathematics and Computer Science Vol. 10, 30 August 2021, Page 1-25

Many standard compression methods are based, at their most basic level, on variable-length coding (VLC) of the Huffman type, designed in 1952, or one of its variants, such as canonical Huffman coding. This article describes a VLC called "Leveling" that is more efficient than Huffman encoding and has two variants: "Leveled Reordering" for low-redundancy data, such as text, and "Segmented Leveling" for moderate- and high-redundancy data, such as images. Leveling, devised by Javier Joglar in 1995, uses the notions of "meaning" and "ordering" of the generated VLC codes to achieve the highest compression ratio of any non-adaptive VLC.
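For readers unfamiliar with the Huffman baseline that Leveling is compared against, the sketch below constructs a classic Huffman code table in Python. It is a generic textbook illustration only, not the Leveling algorithm described in the chapter.

```python
import heapq
from collections import Counter

def huffman_codes(data):
    """Build a Huffman code table for the symbols in `data`."""
    freq = Counter(data)
    # Heap entries: (frequency, tie-breaker, tree), where tree is either a
    # symbol or a (left, right) pair; the tie-breaker keeps comparisons total.
    heap = [(f, i, s) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    if count == 1:  # degenerate case: a single distinct symbol
        return {heap[0][2]: "0"}
    while len(heap) > 1:  # repeatedly merge the two least frequent subtrees
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (left, right)))
        count += 1
    codes = {}
    def walk(node, prefix):  # read codewords off the tree: 0 = left, 1 = right
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix
    walk(heap[0][2], "")
    return codes

codes = huffman_codes("abracadabra")
# More frequent symbols receive shorter codewords, e.g. 'a' (5 occurrences)
# gets a shorter code than 'c' (1 occurrence).
```

The resulting code is prefix-free, so the codeword lengths satisfy the Kraft equality with a full binary tree; non-adaptive schemes such as Leveling compete with this table on the achieved compression ratio.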

Sentiment Analysis on Public Land Transport Infrastructure in Davao Region using Machine Learning Algorithms: A Recent Study

Mark Van M. Buladaco, Jumar S. Buladaco, Laarni M. Cantero

Current Topics on Mathematics and Computer Science Vol. 10, 30 August 2021, Page 26-34

Land transport infrastructure has always been an important aspect of a city. People increasingly use social media to express their feelings about city projects such as land transportation. Government organisations have trouble detecting the concerns that develop as people use social media to criticise land transportation infrastructure. These text-based social media posts can be examined using sentiment analysis, a key task in Natural Language Processing (NLP). Sentiment analysis is the process of deriving the sentiment of a particular statement or sentence. This study used a social networking website to create a model of sentiments on land transportation infrastructure in Region XI (Davao Region), the Philippines, and tested the model's correctness on a data set. The corpus contains 1,200 text entries in total, divided into two parts: a test dataset (25 percent) and a training dataset (75 percent). To perform sentiment analysis on the text data, machine learning text classifiers such as Support Vector Machines (SVM), Random Forest (RF), and Multinomial Naïve Bayes (MNB) were utilised. The performance of each classification model is estimated by generating a confusion matrix with precision and recall calculations, combined into the F1-score; the accuracy rating was calculated as well. A comparison was also made based on the test results of the three machine learning classifiers. SVM achieved the highest accuracy, 76.12 percent, with an F1-score of 71.98 percent. This research will be used to inform and support policy-making and the development of land transportation infrastructure in the Davao Region.
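As an illustration of one of the classifiers mentioned above, the following sketch implements a Multinomial Naïve Bayes text classifier from scratch with Laplace smoothing. The toy documents and labels are invented for demonstration and are not the study's Davao Region dataset.

```python
import math
from collections import Counter

def train_mnb(docs, labels):
    """Train a Multinomial Naive Bayes text classifier with Laplace smoothing."""
    classes = set(labels)
    priors = {c: math.log(labels.count(c) / len(labels)) for c in classes}
    word_counts = {c: Counter() for c in classes}
    vocab = set()
    for doc, c in zip(docs, labels):
        tokens = doc.lower().split()
        word_counts[c].update(tokens)
        vocab.update(tokens)
    V = len(vocab)
    loglik = {}
    for c in classes:
        total = sum(word_counts[c].values())
        # Laplace (add-one) smoothing avoids zero probabilities.
        loglik[c] = {w: math.log((word_counts[c][w] + 1) / (total + V))
                     for w in vocab}
        loglik[c]["<unk>"] = math.log(1 / (total + V))  # unseen-word fallback
    return priors, loglik

def predict_mnb(model, doc):
    """Pick the class maximizing log prior + sum of word log likelihoods."""
    priors, loglik = model
    scores = {c: priors[c] + sum(loglik[c].get(w, loglik[c]["<unk>"])
                                 for w in doc.lower().split())
              for c in priors}
    return max(scores, key=scores.get)

train_docs = ["the new road is great", "smooth fast highway project",
              "terrible traffic and potholes", "delayed project bad roads"]
train_labels = ["pos", "pos", "neg", "neg"]
model = train_mnb(train_docs, train_labels)
print(predict_mnb(model, "great new highway"))  # prints: pos
```

In practice a 75/25 train/test split as in the study, plus a confusion matrix with precision, recall, and F1, would be computed over held-out documents rather than a single prediction.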

Study on Fuzzy Primes and Primary Submodules

Pratibha Kumar

Current Topics on Mathematics and Computer Science Vol. 10, 30 August 2021, Page 35-45

In this chapter the author attempts to fuzzify the concepts of prime, primary submodules and the radical of a fuzzy submodule. These concepts are studied in terms of their level submodules. Some properties of quotient of a fuzzy submodule and the intersection of two fuzzy primary submodules are discussed. The algebraic nature of fuzzy prime submodules and radical of fuzzy prime submodule is also carried out. Finally, fuzzy cosets of fuzzy submodules are also defined.

Orthogonal arrays such as factorial and fractional factorial experimental plans are used to identify important factors and improve the quality of an experiment. Super Saturated Designs are very cost-effective in the early stages of scientific investigation. Nearly-Orthogonal Arrays, which can be used to construct a variety of small-run designs with different levels, have good statistical properties. In the present paper, Super Saturated Designs and Nearly Orthogonal Designs are constructed from Orthogonal Designs. There is a great deal of interest in the development of factor screening experiments that are optimal or highly efficient under the \(E(s^2)\) and J2 criteria. We focus on finding a combinatorial solution for the experiment. We propose a class of special super saturated designs using a Hadamard design on thalassemic children's data, and construct mixed-level orthogonal arrays and nearly orthogonal arrays.
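To illustrate the constructions involved, the sketch below builds a Hadamard matrix by the Sylvester doubling construction, derives a small supersaturated design from it by a Lin-style half fraction, and evaluates the \(E(s^2)\) criterion. This is a generic textbook construction, not the specific designs built on the thalassemic children's data in the chapter.

```python
def sylvester_hadamard(k):
    """Hadamard matrix of order 2**k via the Sylvester doubling construction."""
    H = [[1]]
    for _ in range(k):
        H = [row + row for row in H] + [row + [-x for x in row] for row in H]
    return H

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def e_s2(D):
    """E(s^2) criterion: mean squared inner product over distinct column pairs."""
    m = len(D[0])
    cols = [[row[j] for row in D] for j in range(m)]
    vals = [dot(cols[i], cols[j]) ** 2 for i in range(m) for j in range(i + 1, m)]
    return sum(vals) / len(vals)

H = sylvester_hadamard(3)  # order 8; rows are pairwise orthogonal: H Hᵀ = 8·I
assert all(dot(H[i], H[j]) == (8 if i == j else 0)
           for i in range(8) for j in range(8))

# Lin-style half fraction: keep the rows where a branching column equals +1,
# then drop the constant first column and the branching column itself.
ssd = [row[2:] for row in H if row[1] == 1]  # 4 runs, 6 factors: supersaturated
```

With 6 factors in only 4 runs the design cannot be fully orthogonal, but `e_s2(ssd)` evaluates to 3.2, which matches the known lower bound N²(m−N+1)/((m−1)(N−1)) for these parameters, illustrating why such half fractions are attractive screening designs.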

The main purpose of this chapter is to propose a mathematical method for fuzzy control of a stochastic-fuzzy object. The proposed hybrid method uses both the stochastic-fuzzy knowledge base describing the object under control and large-scale data to calculate the probabilities of the fuzzy conditional statements in the knowledge base.

This study includes: the presentation of multistage fuzzy optimization methods for dynamical discrete systems represented by deterministic, stochastic, and fuzzy models (paragraph 2); and the proposal of modelling the stochastic-fuzzy object under control in the form of a knowledge base (paragraph 3). Exemplary calculations illustrate the theoretical description.

In the theory of fuzzy concept lattices, generating fuzzy concepts from given data with fuzzy attributes is one of the fundamental problems. Since fuzzy concepts are the fixpoints of a particular fuzzy operator associated with the input data, the problem of generating fuzzy concepts turns out to be the problem of computing all fixpoints of this operator. In this article, we establish ten rules for generating fixpoints of two fuzzy closure operators, \(\uparrow\downarrow\) and \(\downarrow\uparrow\). Unifying the proposed rules, we then present a new method and algorithm for computing those fixpoints (fuzzy concepts), which are defined as min-generated fuzzy concepts.
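The fixpoint view of concept generation can be illustrated in the crisp special case (membership degrees restricted to {0, 1}), where the operators \(\uparrow\) and \(\downarrow\) reduce to the usual derivation operators of formal concept analysis. The toy context below is invented for illustration, and the naive enumeration of closures is not the chapter's ten-rule algorithm.

```python
from itertools import combinations

# Toy formal context: which objects have which attributes (crisp case).
objects = ["o1", "o2", "o3"]
attributes = ["a", "b", "c"]
I = {("o1", "a"), ("o1", "b"), ("o2", "b"), ("o2", "c"), ("o3", "b")}

def up(A):
    """A↑: the attributes shared by every object in A."""
    return frozenset(m for m in attributes if all((g, m) in I for g in A))

def down(B):
    """B↓: the objects having every attribute in B."""
    return frozenset(g for g in objects if all((g, m) in I for m in B))

# Fixpoints of the closure operator A -> (A↑)↓ are exactly the concept extents.
extents = set()
for r in range(len(objects) + 1):
    for A in map(frozenset, combinations(objects, r)):
        extents.add(down(up(A)))

concepts = sorted((sorted(E), sorted(up(E))) for E in extents)
```

For this context the fixpoints are the extents {}, {o1}, {o2}, and {o1, o2, o3}, giving four formal concepts; the chapter's rules aim to reach such fixpoints without enumerating all subsets.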

Fast Approach to Factorize Odd Integers: An Algorithm-Based Study

Xingbo Wang, Junjian Zhong

Current Topics on Mathematics and Computer Science Vol. 10, 30 August 2021, Page 80-98

The paper proves that an odd composite integer N = pq can be factorized in \(O((\log_2 N)^4)\) bit operations if the divisor q is of the form \(2^\alpha u + 1\) or \(2^\alpha u - 1\), with u an odd integer and \(\alpha\) a positive integer, and the other divisor p satisfies \(1 < p \leq 2^\alpha + 1\) or \(2^\alpha + 1 < p \leq 2^{\alpha+1} - 1\). Theorems and corollaries are proved with detailed mathematical reasoning. An algorithm to factorize such odd composite integers is designed and tested in Maple. The results in the paper demonstrate that fast factorization of odd integers is possible with the help of the valuated binary tree.
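A brute-force illustration of the stated divisor form (not the paper's valuated-binary-tree algorithm, and with none of its complexity guarantees): scan exponents \(\alpha\) and odd values of u, testing candidates \(q = 2^\alpha u \pm 1\) for divisibility. The search bounds are arbitrary assumptions for the demonstration.

```python
def find_divisor_special(N, max_alpha=40, max_u=10_000):
    """Search for a divisor of odd N of the form 2**a * u + 1 or 2**a * u - 1
    with u odd. Purely illustrative brute force; the bounds are assumptions."""
    for a in range(1, max_alpha + 1):
        for u in range(1, max_u + 1, 2):  # odd u only
            for q in (2**a * u + 1, 2**a * u - 1):
                if 1 < q < N and N % q == 0:
                    return q, N // q
    return None

# Example: 1067 = 11 * 97, where 11 = 2*5 + 1 and 97 = 2**5 * 3 + 1
# both fit the stated form.
print(find_divisor_special(1067))  # prints: (11, 97)
```

The point of the paper is that, for divisors of this special shape, the valuated binary tree locates q in polynomially many bit operations rather than by such exhaustive scanning.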

Training Application for Industries by Means of Virtual Reality Technology: A Recent Study

Mihalache Ghinea, Gicu Calin Deac, Crina Narcisa Georgescu

Current Topics on Mathematics and Computer Science Vol. 10, 30 August 2021, Page 99-111

One of the pillars of Industry 4.0 is augmented and virtual reality, which can improve the perception of industrial processes in real time, among other benefits. The purpose of this research is to establish a starting point for extending the use of VR HMDs (e.g., HTC Vive, Oculus Rift), combining virtual and augmented reality with gesture interaction in a single application. The project relies on the Leap Motion controller. This controller has an integrated infrared depth camera that can be used as an input video feed for the stereoscopic view on the Oculus Rift while projecting raw images of the user's hands onto a 3D mesh that can interact with other objects in real-world space (in our case, a 3D model of a complex product). The pass-through from the twin Leap Motion cameras covers the mesh so that each camera provides a distinct view of the hands, letting the user perceive actual depth. The user can interact with the 3D model and trigger functional animations. We present part of an industrial VR application used in a common industry (door-lock manufacturing), which can be useful for research, training, and advertising as well.

Encouraging the Use of Object Constraint Language Rules to Facilitate the Creation of Mobile Applications

Jean Carlos Hrycyk, Inali Wisniewski Soares, Luciane Telinski Wiedermann Agner

Current Topics on Mathematics and Computer Science Vol. 10, 30 August 2021, Page 112-120

The increasing use of mobile devices encourages mobile app development. However, the competitive market demands quality software in reduced time. This research encourages the use of Object Constraint Language rules to support the development of mobile application models in Model Driven Engineering.

An Adaptive Approach to Design, Measure and Improve IT Agility: ITAAM Framework

Yassine Rdiouat, Wafaa Dachry, Alami Semma

Current Topics on Mathematics and Computer Science Vol. 10, 30 August 2021, Page 121-134

Information systems must develop agility capabilities in order to respond effectively to rapid and unforeseen changes in the business environment, so that the organization can grow, or even survive, in this changing environment. As a result, organizations need guidance on how to design, measure, and enhance the agility of their information systems. An iterative, integrated, flexible, and balanced framework has been established for this purpose. The ITAAM (Information Technology Agility Assessment Model) framework is based on a simple model derived from the Balanced Scorecard approach, which is widely used as a performance measurement system. Our approach consists of a conceptual framework, an evaluation methodology, an agility grid, and a scoring system to compute a Global Agility Index. Furthermore, during each evaluation cycle, the agility level is determined, and a set of recommendations, together with appropriate organizational adjustment guidelines, is offered. As a result, we can ensure long-term viability and ongoing agility improvement, which is the primary purpose of our framework. We believe the framework can be adapted and generalized by sector of activity, or by groups of sectors that share common concerns and similar issues (level of competition, competitiveness, changes, etc.).
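As a rough illustration of how a Balanced Scorecard-style Global Agility Index might be computed, the sketch below aggregates a hypothetical agility grid of weighted perspectives. The perspective names, weights, 0-5 scale, and weighted-mean formula are all assumptions for demonstration, not ITAAM's actual grid or scoring rules.

```python
# Hypothetical agility grid: each BSC-style perspective holds criterion
# scores (0-5) and a weight; weights sum to 1. All values are invented.
grid = {
    "financial":        {"weight": 0.2, "scores": [3, 4, 2]},
    "customer":         {"weight": 0.3, "scores": [4, 4, 5]},
    "internal_process": {"weight": 0.3, "scores": [2, 3, 3]},
    "learning_growth":  {"weight": 0.2, "scores": [3, 3, 4]},
}

def global_agility_index(grid):
    """Weighted mean of per-perspective average scores, normalized to [0, 1]."""
    total = sum(p["weight"] * (sum(p["scores"]) / len(p["scores"]))
                for p in grid.values())
    return total / 5  # normalize the 0-5 scale to a 0-1 index

gai = global_agility_index(grid)
```

An evaluation cycle would then map the index onto an agility level and attach recommendations; re-scoring the grid in the next cycle makes the improvement measurable.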

Numerical Modelling of Aseismic Ground Deformation Problem

Subhash Chandra Mondal, Suma Debsarma

Current Topics on Mathematics and Computer Science Vol. 10, 30 August 2021, Page 135-152

For problems involving aseismic ground deformation in seismically active regions, numerical approaches based on a finite element scheme with discontinuity have been developed. We emphasise the utility of such numerical techniques in tackling geodynamics problems. A long strike-slip fault is considered in a viscoelastic half space representing the lithosphere-asthenosphere system. Under the influence of tectonic forces caused by mantle convection, the fault moves abruptly. The resulting boundary value problems were solved using a numerical technique, developed specifically for the task, based on a finite element scheme with appropriate boundary conditions. The numerical algorithms described here can be adapted to solve more general deformation problems for which analytical methods become too complex. Appropriate modifications can also be incorporated into our model in cases where the tectonic forces are not anti-symmetric or the fault plane is not vertical.
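A toy analogue of a finite element scheme with a displacement discontinuity: the 1D sketch below solves \(-u'' = 0\) on [0, 1] with homogeneous Dirichlet conditions and a prescribed jump at the midpoint node, using the split-node technique in which the jump terms move to the right-hand side. This is a drastic simplification of the chapter's viscoelastic half-space model, included only to show how a discontinuity enters the assembled system.

```python
def solve_fault_1d(n=4, delta=1.0):
    """Linear FEM for -u'' = 0 on [0,1], u(0) = u(1) = 0, with a prescribed
    displacement jump `delta` at the midpoint node (split-node treatment).
    Returns the interior nodal values; at the fault node, the LEFT value."""
    h = 1.0 / n
    k = n // 2                       # fault node index
    m = n - 1                        # interior unknowns u_1 .. u_{n-1}
    # Standard tridiagonal stiffness matrix for piecewise-linear elements.
    A = [[0.0] * m for _ in range(m)]
    for i in range(m):
        A[i][i] = 2.0 / h
        if i + 1 < m:
            A[i][i + 1] = A[i + 1][i] = -1.0 / h
    b = [0.0] * m
    # Split node: the element to the right of the fault sees u_k + delta,
    # which contributes known delta terms to the right-hand side.
    b[k - 1] -= delta / h            # equation at the fault node (left value)
    b[k]     += delta / h            # equation at the node right of the fault
    # Gaussian elimination (no pivoting needed: SPD tridiagonal system).
    for i in range(1, m):
        f = A[i][i - 1] / A[i - 1][i - 1]
        for j in range(m):
            A[i][j] -= f * A[i - 1][j]
        b[i] -= f * b[i - 1]
    u = [0.0] * m
    for i in range(m - 1, -1, -1):
        u[i] = (b[i] - sum(A[i][j] * u[j] for j in range(i + 1, m))) / A[i][i]
    return u

u = solve_fault_1d(n=4, delta=1.0)
# Exact dislocation solution: u = -x left of the fault, u = 1 - x to its right.
```

With n = 4 and delta = 1 the computed interior values are exactly -0.25, -0.5 (left value at the fault), and 0.25, matching the analytic dislocation solution; in the chapter's setting the same idea is applied to a strike-slip fault in a viscoelastic medium.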