Machine learning (ML), a computational approach in which systems learn from data, is expected to be applied in a wide variety of settings. Rather than relying on traditional code written line by line, ML uses models built through a learning process: these models are constructed and trained on historical data and are then used to predict outcomes for new inputs. Scalability is a major challenge in real-world machine learning applications. Many ML-based systems must analyze new data and produce forecasts quickly, because predictions become worthless after a few ticks (think of real-time settings such as stock markets and clickstream data). At the same time, many machine learning applications must scale to train on gigabytes or terabytes of data (as when a model is trained on a web-scale image corpus). High-dimensional problems pose further obstacles, so machine learning practitioners are increasingly concerned with scalability as well as algorithmic quality. Against this backdrop, this overview article collects, investigates, and analyzes the current state, aspects, and perspectives of scalability in machine learning platforms, and the ways in which scalability can be added to such platforms to improve efficiency and reliability when processing large amounts of data.
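As a loose illustration of the scalability concern described above (not drawn from the cited article), the sketch below shows one common pattern: out-of-core learning, where feature statistics and model weights are updated incrementally on chunks of data so the full dataset never has to fit in memory. It uses scikit-learn's partial_fit API; the chunk_stream generator, its parameters, and the synthetic labeling rule are purely illustrative assumptions.

```python
# Minimal sketch of incremental (out-of-core) training on streamed chunks.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import SGDClassifier

def chunk_stream(n_chunks=20, chunk_size=1_000, n_features=50, seed=0):
    """Yield synthetic (X, y) chunks standing in for a large data stream."""
    rng = np.random.default_rng(seed)
    for _ in range(n_chunks):
        X = rng.normal(size=(chunk_size, n_features))
        # Simple linear rule so the synthetic problem is learnable.
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
        yield X, y

scaler = StandardScaler()
clf = SGDClassifier(random_state=0)
classes = np.array([0, 1])  # all classes must be declared on the first partial_fit call

for X_chunk, y_chunk in chunk_stream():
    scaler.partial_fit(X_chunk)            # update feature statistics incrementally
    X_scaled = scaler.transform(X_chunk)   # scale with the statistics seen so far
    clf.partial_fit(X_scaled, y_chunk, classes=classes)

print("accuracy on last chunk:", clf.score(X_scaled, y_chunk))
```

Because both the scaler and the classifier are updated chunk by chunk, memory use stays bounded by the chunk size rather than by the total volume of training data.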
Sharma, V. (2022). A Study on Data Scaling Methods for Machine Learning. International Journal for Global Academic & Scientific Research, 1(1). https://doi.org/10.55938/ijgasr.v1i1.4