College of Artificial Intelligence and Big Data, Chongqing Industry Polytechnic College, Chongqing, China.
Comput Intell Neurosci. 2022 Mar 19;2022:3411959. doi: 10.1155/2022/3411959. eCollection 2022.
With the continuous innovation of Internet technology and the ongoing development of the social economy, Big Data and cloud computing technologies are increasingly applied in people's work and daily life. Parallel algorithms play an important role in solving large systems of linear equations across many applications. To this end, this article proposes a cloud computing task scheduling model based on the solution of large-scale linear equations. Our method studies techniques for solving large-scale linear systems and proposes the M-QoS-OCCSM scheduling model. The aim of the experimental method is to execute mutually dependent parallel tasks efficiently within limited resources while satisfying users' expectations for task completion time, bandwidth rate, reliability, and cost. Task scheduling algorithms are studied through experiments that apply large-scale linear equations to task scheduling. The results show that at task loads of 10 and 20, the MPQGA algorithm converges 32 seconds and 95 seconds faster, respectively, than the BGA algorithm.
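To make the multi-QoS objective concrete, the sketch below shows one way a genetic-algorithm scheduler could score a candidate schedule against the user's expectations on completion time, bandwidth, reliability, and cost. All names, weights, and the normalization scheme are illustrative assumptions, not the paper's actual M-QoS-OCCSM formulation.

```python
# Hedged sketch: a weighted QoS fitness function such as a genetic-algorithm
# scheduler (e.g., MPQGA-style) might use to rank candidate schedules.
# The Schedule fields, weights, and normalization are assumed for illustration.
from dataclasses import dataclass


@dataclass
class Schedule:
    makespan: float      # total completion time (seconds); smaller is better
    bandwidth: float     # achieved bandwidth rate (MB/s); larger is better
    reliability: float   # probability of failure-free completion, in [0, 1]
    cost: float          # monetary cost of the rented resources; smaller is better


def fitness(s: Schedule, expect: Schedule,
            weights=(0.4, 0.2, 0.2, 0.2)) -> float:
    """Score a candidate schedule against the user's QoS expectations.

    Each objective is normalized to [0, 1] relative to the expectation,
    then combined as a weighted sum; higher is better (1.0 = all met).
    """
    w_time, w_bw, w_rel, w_cost = weights
    # Time and cost: smaller is better, so score = expected / actual, capped at 1.
    time_score = min(1.0, expect.makespan / s.makespan)
    cost_score = min(1.0, expect.cost / s.cost)
    # Bandwidth and reliability: larger is better, so score = actual / expected.
    bw_score = min(1.0, s.bandwidth / expect.bandwidth)
    rel_score = min(1.0, s.reliability / expect.reliability)
    return (w_time * time_score + w_bw * bw_score
            + w_rel * rel_score + w_cost * cost_score)
```

In a genetic algorithm, this score would drive selection: schedules that meet or beat every expectation score 1.0, while any shortfall in an objective lowers the fitness in proportion to its weight.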