Search Results (1,002)

Search Parameters:
Journal = Computers

14 pages, 283 KiB  
Article
A Comparative Study of Commit Representations for JIT Vulnerability Prediction
Computers 2024, 13(1), 22; https://doi.org/10.3390/computers13010022 - 11 Jan 2024
Abstract
With the evolution of software systems, their size and complexity are rising rapidly. Identifying vulnerabilities as early as possible is crucial for ensuring high software quality and security. Just-in-time (JIT) vulnerability prediction, which aims to find vulnerabilities at the time of commit, has increasingly become a focus of attention. In our work, we present a comparative study to provide insights into the current state of JIT vulnerability prediction by examining three candidate models: CC2Vec, DeepJIT, and Code Change Tree. These unique approaches aptly represent the various techniques used in the field, allowing us to offer a thorough description of the current limitations and strengths of JIT vulnerability prediction. Our focus was on the predictive power of the models, their usability in terms of false positive (FP) rates, and the granularity of the source code analysis they are capable of handling. For training and evaluation, we used two recently published datasets containing vulnerability-inducing commits: ProjectKB and Defectors. Our results highlight the trade-offs between predictive accuracy and operational flexibility and also provide guidance on the use of ML-based automation for developers, especially considering false positive rates in commit-based vulnerability prediction. These findings can serve as crucial insights for future research and practical applications in software security. Full article
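The usability comparison above hinges on false positive rates. As an illustrative aside (not code from the paper, and with invented confusion counts), the false positive rate of a commit classifier follows directly from its confusion matrix:

```python
def false_positive_rate(tp, fp, tn, fn):
    """Fraction of clean commits wrongly flagged as vulnerability-inducing.
    Only fp and tn enter the ratio; tp and fn are kept for a uniform signature."""
    return fp / (fp + tn) if (fp + tn) else 0.0

# Hypothetical confusion counts for a commit-level classifier.
print(round(false_positive_rate(tp=40, fp=30, tn=870, fn=60), 3))
```

At commit granularity, even a rate of a few percent translates into many needlessly reviewed commits on an active repository, which is why the study weighs FP rates against raw predictive power.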

23 pages, 9993 KiB  
Article
Advanced Road Safety: Collective Perception for Probability of Collision Estimation of Connected Vehicles
Computers 2024, 13(1), 21; https://doi.org/10.3390/computers13010021 - 09 Jan 2024
Abstract
In the dynamic landscape of vehicular communication systems, connected vehicles (CVs) present unprecedented capabilities in perception, cooperation, and, notably, collision probability management. This paper's main concern is the estimation of collision probability. Achieving effective collision estimation heavily relies on the sensor perception of obstacles and a critical collision probability prediction system. This paper is dedicated to refining the estimation of collision probability through the intentional integration of CV communications, with a specific focus on the collective perception of connected vehicles. The primary objective is to enhance the understanding of potential collisions in the surrounding environment by harnessing the collective insights gathered through inter-vehicular communication and collaboration. This improvement enables a superior anticipation capacity for both the driving system and the human driver, thereby enhancing road safety. Furthermore, the incorporation of extended perception strategies holds the potential for more accurate collision probability estimation, providing the driving system or human driver with increased time to react and make informed decisions, further fortifying road safety measures. The results underscore a significant enhancement in collision probability awareness, as connected vehicles collectively contribute to a more comprehensive picture of collision probability. Consequently, this heightened collective perception improves the anticipation capacity of both the driving system and the human driver, contributing to an elevated level of road safety. For future work, the exploration of our extended perception techniques to achieve real-time collision probability estimation is proposed. Such endeavors aim to drive the development of robust and anticipatory autonomous driving systems that truly harness the benefits of connected vehicle technologies.
Full article
(This article belongs to the Special Issue Cooperative Vehicular Networking 2023)

37 pages, 8647 KiB  
Article
Forecasting of Bitcoin Illiquidity Using High-Dimensional and Textual Features
Computers 2024, 13(1), 20; https://doi.org/10.3390/computers13010020 - 09 Jan 2024
Abstract
Liquidity is the ease of converting an asset (physical/digital) into cash or another asset without loss and is shown by the relationship between the time scale and the price scale of an investment. This article examines the illiquidity of Bitcoin (BTC). Bitcoin hash rate information was collected at three different time intervals; in parallel, textual information related to these intervals was collected from Twitter for each day. Due to the regression nature of illiquidity prediction, approaches based on recurrent networks were suggested. Seven approaches (ANN, SVM, SANN, LSTM, Simple RNN, GRU, and IndRNN) were tested on these data. To evaluate these approaches, three evaluation methods were used: random split (paper), random split (run), and linear split (run). The research results indicate that the IndRNN approach provided the best results. Full article
(This article belongs to the Special Issue Uncertainty-Aware Artificial Intelligence)

35 pages, 14642 KiB  
Article
Faraway, so Close: Perceptions of the Metaverse on the Edge of Madness
Computers 2024, 13(1), 19; https://doi.org/10.3390/computers13010019 - 08 Jan 2024
Abstract
With the evolution of technologies, virtual reality allows us to dive into cyberspace through different devices and have immersive experiences in different contexts, which, in a simple way, we call virtual worlds or the multiverse (integrating Metaverse versions). Through virtual reality, it is possible to create infinite simulated environments to immerse ourselves in. The future internet may be slightly different from what we use today. Virtual immersion situations are common (particularly in gaming), and the Metaverse has become a lived and almost real experience claiming its presence in our daily lives. To investigate possible perspectives or concepts regarding the Metaverse, virtual reality, and immersion, we considered a main research question: To what extent can a film centered on the multiverse be associated with adults’ Metaverse perceptions? Considering that all participants are adults, the objectives of this study are: (1) verify the representations of the Metaverse; (2) verify the representations of immersion; (3) verify the representations of the multiverse; (4) verify the importance of a film (related to the Metaverse and the multiverse) on the representations found. This study, framed in a Ph.D. research project, analyzed the participants’ answers through an online survey using two films to gather thoughts, ideas, emotions, sentiments, and reactions according to our research objectives. Some limitations were considered, such as the number of participants, the number of questionnaire questions, and the participants’ knowledge, or lack thereof, of the main concepts. Our results showed that a virtual world created by a movie might stimulate the perception of almost living in that supposed reality, accepting the multiverse and Metaverse not as distant concepts but as close experiences, even in an unconscious form. This finding is also a positive contribution to an ongoing discussion aiming for an essential understanding of the Metaverse as a complex concept. Full article

18 pages, 889 KiB  
Article
Mining Negative Associations from Medical Databases Considering Frequent, Regular, Closed and Maximal Patterns
Computers 2024, 13(1), 18; https://doi.org/10.3390/computers13010018 - 08 Jan 2024
Abstract
Many data mining studies have focused on mining positive associations among frequent and regular item sets. However, none have considered time and regularity together when mining such associations. The sets of frequent and regular item sets will be huge, even when regularity and frequency are considered without any time constraint. Negative associations are equally important in medical databases, reflecting considerable discrepancies in the medications used to treat various disorders. It is important to find the most effective negative associations, and the set of mined associations should be as small as possible so that the most important disconnections can be found. This paper proposes a mining method that mines medical databases to find regular, frequent, closed, and maximal item sets that reflect minimal negative associations. The proposed algorithm reduces the number of negative associations by 70% when the maximal and closed properties are used, for any sample size, regularity, or frequency threshold. Full article
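As a toy illustration of the underlying idea (not the paper's algorithm; the thresholds and item names below are invented), a candidate negative association can be flagged when two items are each frequent on their own yet almost never co-occur:

```python
from itertools import combinations

def negative_pairs(transactions, min_supp=0.4, max_joint=0.1):
    """Flag pairs (A, B) that are each frequent alone but almost never co-occur:
    a simple stand-in for mining negative associations."""
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    supp = {i: sum(i in t for t in transactions) / n for i in items}
    frequent = [i for i in items if supp[i] >= min_supp]
    out = []
    for a, b in combinations(frequent, 2):
        joint = sum(a in t and b in t for t in transactions) / n
        if joint <= max_joint:
            out.append((a, b))
    return out

# Hypothetical prescription records: drugs "x" and "y" are both common
# but are never prescribed together.
tx = [{"x", "z"}, {"x"}, {"y", "z"}, {"y"}, {"x", "z"}, {"y", "z"}]
print(negative_pairs(tx))
```

The closed and maximal constraints used in the paper would then prune this candidate set further, so that only the most informative disconnections remain.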

17 pages, 622 KiB  
Article
An Event-Centric Knowledge Graph Approach for Public Administration as an Enabler for Data Analytics
Computers 2024, 13(1), 17; https://doi.org/10.3390/computers13010017 - 05 Jan 2024
Abstract
In a continuously evolving environment, organizations, including public administrations, need to quickly adapt to change and make decisions in real-time. This requires having a real-time understanding of their context that can be achieved by adopting an event-native mindset in data management which focuses on the dynamics of change compared to the state-based traditional approaches. In this context, this paper proposes the adoption of an event-centric knowledge graph approach for the holistic data management of all data repositories in public administration. Towards this direction, the paper proposes an event-centric knowledge graph model for the domain of public administration that captures these dynamics considering events as first-class entities for knowledge representation. The development of the model is based on a state-of-the-art analysis of existing event-centric knowledge graph models that led to the identification of core concepts related to event representation, on a state-of-the-art analysis of existing public administration models that identified the core entities of the domain, and on a theoretical analysis of concepts related to events, public services, and effective public administration in order to outline the context and identify the domain-specific needs for event modeling. Further, the paper applies the model in the context of Greek public administration in order to validate it and showcase the possibilities that arise. The results show that the adoption of event-centric knowledge graph approaches for data management in public administration can facilitate data analytics, continuous integration, and the provision of a 360-degree-view of end-users. We anticipate that the proposed approach will also facilitate real-time decision-making, continuous intelligence, and ubiquitous AI. Full article

20 pages, 1362 KiB  
Article
A Multi-Channel Packet Scheduling Approach to Improving Video Delivery Performance in Vehicular Networks
Computers 2024, 13(1), 16; https://doi.org/10.3390/computers13010016 - 04 Jan 2024
Abstract
When working with the Wireless Access in Vehicular Environment (WAVE) protocol stack, the multi-channel operation mechanism of the IEEE 1609.4 protocol may impact the overall network performance, especially when using video streaming applications. In general, packets delivered from the application layer during a Control Channel (CCH) time slot have to wait for transmission until the next Service Channel (SCH) time slot arrives. The accumulation of packets at the beginning of the latter time slot may introduce additional delays and higher contention when all the network nodes try, at the same time, to obtain access to the shared channel in order to send the delayed packets as soon as possible. In this work, we have analyzed these performance issues and proposed a new method, which we call SkipCCH, that helps the MAC layer to overcome the high contention produced by the packet transmission bursts at the beginning of every SCH slot. This high contention implies an increase in the number of packet losses, which directly impacts the overall network performance. With our proposal, streaming video in vehicular networks will provide a better quality of reconstructed video at the receiver side under the same network conditions. Furthermore, this method has particularly proven its benefits when working with Quality of Service (QoS) techniques, not only by increasing the received video quality but also because it avoids starvation of the lower-priority traffic. Full article
(This article belongs to the Special Issue Vehicular Networking and Intelligent Transportation Systems 2023)
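To make the deferral effect concrete, the sketch below assumes the usual IEEE 1609.4 configuration of alternating 50 ms CCH and 50 ms SCH slots (guard intervals ignored) and computes how long a packet arriving during the CCH slot must wait before it can be transmitted:

```python
def sch_queue_delay(arrival_ms, cch_ms=50.0, sch_ms=50.0):
    """Delay a packet arriving during the CCH slot must wait until the next
    SCH slot opens (guard intervals ignored in this sketch)."""
    sync = cch_ms + sch_ms
    t = arrival_ms % sync
    return cch_ms - t if t < cch_ms else 0.0  # already inside the SCH slot

# Packets generated every 10 ms across one 100 ms sync interval:
delays = [sch_queue_delay(t) for t in range(0, 100, 10)]
print(delays)
```

Every packet generated during the CCH half of the sync interval is released at the same SCH boundary, producing exactly the contention burst that SkipCCH is designed to smooth out.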

2 pages, 328 KiB  
Correction
Correction: Nayak et al. Brain Tumour Classification Using Noble Deep Learning Approach with Parametric Optimization through Metaheuristics Approaches. Computers 2022, 11, 10
Computers 2024, 13(1), 15; https://doi.org/10.3390/computers13010015 - 03 Jan 2024
Abstract
Figure 1 was reproduced without the correct copyright permissions from the copyright holder (Medical Sciences) [...] Full article

25 pages, 9733 KiB  
Article
Blockchain-Powered Gaming: Bridging Entertainment with Serious Game Objectives
Computers 2024, 13(1), 14; https://doi.org/10.3390/computers13010014 - 03 Jan 2024
Abstract
The advancement and acceptance of new technologies often hinge on the level of understanding and trust among potential users. Blockchain technology, despite its broad applications across diverse sectors, is often met with skepticism due to a general lack of understanding and incidents of illicit activities in the cryptocurrency domain. This study aims to demystify blockchain technology by providing an in-depth examination of its application in a novel blockchain-based card game centered around renewable energy and sustainable resource management. This paper introduces a serious game that uses blockchain to enhance user interaction, ownership, and gameplay, demonstrating the technology’s potential to revolutionize the gaming industry. Notable aspects of the game, such as ownership of virtual assets, transparent transaction histories, trustless game mechanics, user-driven content creation, gasless transactions, and mechanisms for in-game asset trading and cross-platform asset reuse, are analyzed. The paper discusses how these features not only provide a richer gaming experience but also serve as effective tools for raising awareness about sustainable energy and resource management, thereby bridging the gap between entertainment and education. The case study offers valuable insights into how blockchain can create dynamic, secure, and participatory virtual environments, shifting the paradigm of traditional online gaming. Full article
(This article belongs to the Special Issue When Blockchain Meets IoT: Challenges and Potentials)

21 pages, 2681 KiB  
Article
A Comparative Study on Recent Automatic Data Fusion Methods
Computers 2024, 13(1), 13; https://doi.org/10.3390/computers13010013 - 30 Dec 2023
Abstract
Automatic data fusion is an important field of machine learning that has been increasingly studied. The objective is to improve the classification performance from several individual classifiers in terms of accuracy and stability of the results. This paper presents a comparative study on recent data fusion methods. The fusion step can be applied at early and/or late stages of the classification procedure. Early fusion consists of combining features from different sources or domains to form the observation vector before the training of the individual classifiers. In contrast, late fusion consists of combining the results from the individual classifiers after the testing stage. Late fusion has two setups: combination of the posterior probabilities (scores), which is called soft fusion, and combination of the decisions, which is called hard fusion. A theoretical analysis of the conditions for applying the three kinds of fusion (early, late soft, and late hard) is introduced. Thus, we propose a comparative analysis with different schemes of fusion, including the weaknesses and strengths of the state-of-the-art methods studied from the following perspectives: sensors, features, scores, and decisions. Full article
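The soft/hard distinction can be made concrete with a short sketch (illustrative only; the classifier scores are invented):

```python
from collections import Counter

def soft_fusion(score_lists):
    """Average per-class posterior scores across classifiers, then take argmax."""
    n = len(score_lists)
    avg = {c: sum(s[c] for s in score_lists) / n for c in score_lists[0]}
    return max(avg, key=avg.get)

def hard_fusion(decisions):
    """Majority vote over the individual classifiers' crisp decisions."""
    return Counter(decisions).most_common(1)[0][0]

# Three hypothetical classifiers scoring classes "a" and "b":
scores = [{"a": 0.9, "b": 0.1}, {"a": 0.4, "b": 0.6}, {"a": 0.45, "b": 0.55}]
print(soft_fusion(scores))
print(hard_fusion([max(s, key=s.get) for s in scores]))
```

Note how one confident classifier pulls the averaged scores toward "a" while the majority vote still picks "b"; this divergence is one reason the two late-fusion setups are analyzed separately.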

19 pages, 4843 KiB  
Article
A Robust Timing Synchronization Algorithm Based on PSSS for LTE-V2X
Computers 2024, 13(1), 12; https://doi.org/10.3390/computers13010012 - 30 Dec 2023
Abstract
In recent years, Long-Term Evolution Vehicle-to-Everything (LTE-V2X) communication technology has received extensive attention. Timing synchronization is a crucial step in the receiving process, addressing Timing Offsets (TOs) resulting from random propagation delays, sampling frequency mismatches between the transmitter and receiver or a combination of both. However, the presence of high-speed relative movement between nodes and a low antenna height leads to a significant Doppler frequency offset, resulting in a low Signal-to-Noise Ratio (SNR) for received signals in LTE-V2X communication scenarios. This paper aims to investigate LTE-V2X technology with a specific focus on time synchronization. The research centers on the time synchronization method utilizing the Primary Sidelink Synchronization Signal (PSSS) and conducts a comprehensive analysis of existing algorithms, highlighting their respective advantages and disadvantages. On this basis, a robust timing synchronization algorithm for LTE-V2X communication scenarios is proposed. The algorithm comprises three key steps: coarse synchronization, frequency offset estimation and fine synchronization. Enhanced robustness is achieved through algorithm fusion, optimal decision threshold design and predefined frequency offset values. Furthermore, a hardware-in-the-loop simulation platform is established. The simulation results demonstrate a substantial performance improvement for the proposed algorithm compared to existing methods under adverse channel conditions characterized by high frequency offsets and low SNR. Full article
(This article belongs to the Special Issue Vehicular Networking and Intelligent Transportation Systems 2023)
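As a simplified view of the coarse-synchronization step (a plain correlation peak search over a real-valued stand-in for the PSSS, not the proposed algorithm):

```python
def coarse_timing(rx, ref):
    """Coarse timing sync: slide the known reference sequence over the received
    samples and return the lag with the highest correlation."""
    best_lag, best = 0, float("-inf")
    for lag in range(len(rx) - len(ref) + 1):
        corr = sum(rx[lag + k] * ref[k] for k in range(len(ref)))
        if corr > best:
            best_lag, best = lag, corr
    return best_lag

ref = [1, -1, 1, 1, -1]          # stand-in for a PSSS-like sequence
rx = [0, 0, 0] + ref + [0, 0]    # reference arrives after a 3-sample delay
print(coarse_timing(rx, ref))
```

The proposed algorithm augments this basic search with frequency offset estimation, optimized decision thresholds, and a fine-synchronization stage to stay robust at high frequency offsets and low SNR.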

25 pages, 2717 KiB  
Article
Towards Blockchain-Integrated Enterprise Resource Planning: A Pre-Implementation Guide
Computers 2024, 13(1), 11; https://doi.org/10.3390/computers13010011 - 26 Dec 2023
Abstract
In the face of numerous challenges in supply chain management, new technologies are being implemented to overcome obstacles and improve overall performance. Among these technologies, blockchain, a part of the distributed ledger family, offers several advantages when integrated with ERP systems, such as transparency, traceability, and data security. However, blockchain remains a novel, complex, and costly technology. The purpose of this paper is to guide decision-makers in determining whether integrating blockchain technology with ERP systems is appropriate during the pre-implementation phase. This paper focuses on the literature reviews, theories, and expert opinions to achieve its objectives. It first provides an overview of blockchain technology, then discusses its potential benefits to the supply chain, and finally proposes a framework to assist decision-makers in determining whether blockchain meets the needs of their consortium and whether this integration aligns with available resources. The results highlight the complexity of blockchain, the importance of detailed and in-depth research in deciding whether to integrate blockchain technology into ERP systems, and future research prospects. The findings of this article also present the critical decisions to be made prior to the implementation of blockchain, in the event that decision-makers choose to proceed with blockchain integration. The findings of this article augment the existing literature and can be applied in real-world contexts by stakeholders involved in blockchain integration projects with ERP systems. Full article

17 pages, 950 KiB  
Article
Facilitating Communication in Neuromuscular Diseases: An Adaptive Approach with Fuzzy Logic and Machine Learning in Augmentative and Alternative Communication Systems
Computers 2024, 13(1), 10; https://doi.org/10.3390/computers13010010 - 26 Dec 2023
Abstract
Augmentative and alternative communication (AAC) techniques are essential to assist individuals facing communication difficulties. (1) Background: It is acknowledged that dynamic solutions that adjust to the changing needs of patients are necessary in the context of neuromuscular diseases. (2) Methods: In order to address this concern, a differential approach was suggested that entailed the prior identification of the disease state. This approach employs fuzzy logic to ascertain the disease stage by analyzing intuitive patterns; it is contrasted with two intelligent systems. (3) Results: The results indicate that the AAC system’s adaptability improves with the progression of the disease’s phases, thereby ensuring its utility throughout the lifespan of the individual. Although the adaptive AAC system exhibits signs of improvement, an expanded assessment involving a greater number of patients is required. (4) Conclusions: Qualitative assessments of comparative studies shed light on the difficulties associated with enhancing accuracy and adaptability. This research highlights the significance of investigating the use of fuzzy logic or artificial intelligence methods in order to solve the issue of symptom variability in disease staging. Full article

16 pages, 1372 KiB  
Article
Custom ASIC Design for SHA-256 Using Open-Source Tools
Computers 2024, 13(1), 9; https://doi.org/10.3390/computers13010009 - 25 Dec 2023
Abstract
The growth of digital communications has driven the development of numerous cryptographic methods for secure data transfer and storage. The SHA-256 algorithm is a cryptographic hash function widely used for validating data authenticity, identity, and integrity. The inherent SHA-256 computational overhead has motivated the search for more efficient hardware solutions, such as application-specific integrated circuits (ASICs). This work presents a custom ASIC hardware accelerator for the SHA-256 algorithm entirely created using open-source electronic design automation tools. The integrated circuit was synthesized using SkyWater SKY130 130 nm process technology through the OpenLANE automated workflow. The proposed final design is compatible with 32-bit microcontrollers, has a total area of 104,585 µm2, and operates at a maximum clock frequency of 97.9 MHz. Several optimization configurations were tested and analyzed during the synthesis phase to enhance the performance of the final design. Full article
(This article belongs to the Special Issue Feature Papers in Computers 2023)
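For reference, the validation role of SHA-256 described above can be reproduced with Python's standard library; a hardware accelerator offloads exactly this digest computation:

```python
import hashlib

# Integrity check: recompute the digest of the received data and compare
# it with the digest computed by the sender.
data = b"hello world"
digest = hashlib.sha256(data).hexdigest()
assert digest == hashlib.sha256(b"hello world").hexdigest()   # data intact
assert digest != hashlib.sha256(b"hello worlds").hexdigest()  # any change alters the digest
print(digest[:16])
```

The fixed-function dataflow of the compression rounds is what makes SHA-256 a good fit for an ASIC: the same computation shown here runs in dedicated hardware at a fixed, predictable throughput.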

33 pages, 8073 KiB  
Article
The Doubly Linked Tree of Singly Linked Rings: Providing Hard Real-Time Database Operations on an FPGA
Computers 2024, 13(1), 8; https://doi.org/10.3390/computers13010008 - 24 Dec 2023
Abstract
We present a hardware data structure specifically designed for FPGAs that enables the execution of hard real-time database CRUD operations using a hybrid data structure that combines trees and rings. While the number of rows and columns has to be limited for hard real-time execution, the actual content can be of any size. Our structure restricts full navigational freedom to every layer but the leaf layer, thus keeping the memory overhead for the data stored in the leaves low. Although its nodes differ in function, all have exactly the same size and structure, reducing the number of cascaded decisions required in the database operations. This enables a fast and efficient hardware implementation on FPGAs. In addition to the usual comparison with known data structures, we also analyze the tradeoff between the memory consumption of our approach and that of a simplified version that is doubly linked in all layers. Full article
(This article belongs to the Special Issue Advances in Database Engineered Applications 2023)
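The singly linked rings of the leaf layer can be modeled in software as follows (an illustrative sketch only; the actual design is a fixed-layout hardware structure):

```python
class Node:
    """One payload slot and one forward link, so every node in the ring has an
    identical layout, mirroring the uniform node structure of the hardware design."""
    __slots__ = ("value", "next")
    def __init__(self, value):
        self.value, self.next = value, self

def make_ring(values):
    """Build a singly linked ring; the last node links back to the first."""
    nodes = [Node(v) for v in values]
    for a, b in zip(nodes, nodes[1:] + nodes[:1]):
        a.next = b
    return nodes[0]

def walk(head, steps):
    """Follow the forward links for a given number of steps."""
    out, node = [], head
    for _ in range(steps):
        out.append(node.value)
        node = node.next
    return out

head = make_ring(["r1", "r2", "r3"])
print(walk(head, 5))
```

Because each node carries only a single forward link, a traversal always wraps back to its starting node, which is what keeps the per-node memory overhead in the leaf layer low.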
