Peechara Ravinder Rao, V Sucharita
Department of Computer Science, Koneru Lakshmaiah Education Foundation, Guntur, Andhra Pradesh, India.
Department of Computer Science and Engineering, Koneru Lakshmaiah Education Foundation, Guntur, Andhra Pradesh, India.
PeerJ Comput Sci. 2021 Aug 9;7:e628. doi: 10.7717/peerj-cs.628. eCollection 2021.
Data exchange over the Internet and other access channels is on the rise, leading to growing security concerns. Many experiments have investigated time-efficient and highly randomized encryption methods for such data. The latest studies, however, remain debated for several reasons. Their outcomes do not yield completely random keys once key lengths grow beyond a certain point, and prominent repetition makes the processes predictable and susceptible to attacks. Furthermore, newly generated keys require updated algorithms to handle high volumes of transactional data successfully. This article presents solutions to these two critical issues. First, a chaotic sequence is used to generate keys, which is sufficient to obtain a high degree of randomness. Second, this work proposes a novel, non-traditional validation test, based on a correlation algorithm, to determine the true randomness of the generated keys. An approximately 100% probability of the key phase over almost infinitely long time intervals minimizes the algorithms' complexity for securing higher data volumes. These algorithms are intended mainly for cloud-based transactions, where data volumes are potentially higher and extremely variable; the suggested algorithms improve data transmission time by 3% to 4%. This research has the potential to improve communication systems over the next ten years by removing decades-long bottlenecks.
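The abstract does not specify the authors' exact chaotic key-generation algorithm, so the following is only a minimal sketch of the general idea: iterating a chaotic map and extracting key bytes from its trajectory. The use of the logistic map, the parameter `r`, the burn-in length, and the byte-extraction rule are all assumptions for illustration, not the paper's method.

```python
# Sketch of chaotic key generation (assumed logistic map, not the
# authors' published algorithm): iterate x_{n+1} = r * x_n * (1 - x_n)
# in the chaotic regime (r close to 4) and map each state to a byte.

def chaotic_key(seed: float, length: int, r: float = 3.99) -> bytes:
    """Derive `length` key bytes from a logistic-map trajectory."""
    if not 0.0 < seed < 1.0:
        raise ValueError("seed must lie strictly between 0 and 1")
    x = seed
    # Burn-in: discard early iterates so the output depends on the
    # map's chaotic behavior rather than the transient near the seed.
    for _ in range(100):
        x = r * x * (1.0 - x)
    out = bytearray()
    for _ in range(length):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)  # quantize state to one byte
    return bytes(out)

key = chaotic_key(0.3141592, 16)
```

Because the map is deterministic, the same seed reproduces the same key, while nearby seeds diverge quickly due to sensitivity to initial conditions; a real implementation would need a cryptographically vetted extraction step, which this sketch omits.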
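The "non-traditional" correlation-based randomness validation is likewise not detailed in the abstract. As a point of reference, the conventional version of such a check is a lag-k autocorrelation test; the function names and the tolerance threshold below are illustrative assumptions.

```python
# Illustrative lag-k autocorrelation check on a bit sequence (a
# conventional stand-in for the paper's unspecified validation test).

def autocorrelation(bits, lag):
    """Fraction of positions where the sequence agrees with its
    lag-shifted copy. A truly random bit stream hovers near 0.5;
    values far from 0.5 indicate predictable repetition."""
    n = len(bits) - lag
    matches = sum(1 for i in range(n) if bits[i] == bits[i + lag])
    return matches / n

def looks_random(bits, max_lag=8, tolerance=0.05):
    """Reject a key stream whose autocorrelation drifts beyond
    `tolerance` of 0.5 at any lag up to `max_lag`."""
    return all(abs(autocorrelation(bits, k) - 0.5) <= tolerance
               for k in range(1, max_lag + 1))
```

For example, a strictly alternating stream `0101...` has autocorrelation 0.0 at lag 1 and 1.0 at lag 2, so `looks_random` correctly rejects it despite its balanced bit counts.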