This paper presents findings from applying a machine learning (ML) technique to optimize the performance of 5G mmWave applications using multiple-input multiple-output (MIMO) antennas operating in the 28 GHz frequency band. The article examines several methodologies, including simulation, measurement, and an RLC-equivalent circuit model, to evaluate the antenna's suitability for its intended applications. In addition to its compact dimensions, the proposed design exhibits a maximum gain of 10.34 dBi, isolation exceeding 26 dB, and a broad bandwidth of 16.56%, centered at 28 GHz and spanning 25.905 to 30.544 GHz. A supervised regression machine-learning technique is used to predict the antenna's gain accurately. The ML models are assessed by several measures: the variance score, R-squared (R²), mean squared error (MSE), mean absolute error (MAE), root mean squared error (RMSE), and mean absolute percentage error (MAPE). Among the six machine-learning models considered, the Gaussian Process Regression (GPR) model exhibits the lowest error and achieves the highest accuracy in predicting gain. The antenna under consideration therefore shows promising qualities for its intended use in high-band 5G applications, as evidenced by the simulation results obtained from Computer Simulation Technology (CST) and Advanced Design System (ADS), together with the measured results and the gain values predicted by the machine-learning models.
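As a minimal sketch of the evaluation measures named above, the following pure-Python helper computes MSE, MAE, RMSE, MAPE, and R² for a set of measured versus predicted gain values. The sample arrays are hypothetical placeholders, not the paper's data, and the function is an illustration of the standard metric definitions rather than the authors' code.

```python
import math

def regression_metrics(y_true, y_pred):
    """Standard regression-error metrics used to compare ML gain predictors."""
    n = len(y_true)
    errors = [t - p for t, p in zip(y_true, y_pred)]
    mse = sum(e * e for e in errors) / n                      # mean squared error
    mae = sum(abs(e) for e in errors) / n                     # mean absolute error
    rmse = math.sqrt(mse)                                     # root mean squared error
    mape = 100.0 * sum(abs(e / t) for e, t in zip(errors, y_true)) / n
    mean_t = sum(y_true) / n
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)           # total sum of squares
    r2 = 1.0 - (mse * n) / ss_tot                             # coefficient of determination
    return {"MSE": mse, "MAE": mae, "RMSE": rmse, "MAPE": mape, "R2": r2}

# Hypothetical gain samples in dBi (illustrative only)
gains_measured = [9.8, 10.1, 10.34, 10.2, 9.9]
gains_predicted = [9.7, 10.0, 10.30, 10.25, 10.0]
print(regression_metrics(gains_measured, gains_predicted))
```

The model with the lowest MSE/MAE/RMSE/MAPE and the R² closest to 1 (here, the GPR model in the paper's comparison) is taken as the most accurate predictor.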