"Firewalls are critical components in securing
communication networks by screening all incoming (and
occasionally exiting) data packets. Filtering is carried out by
comparing incoming data packets to a set of rules designed to
prevent malicious code from entering the network. To regulate
the flow of data packets entering and leaving a network, an
Internet firewall keeps a track of all activity. While the primary
function of log files is to aid in troubleshooting and diagnostics,
the information they contain is also very relevant to system
audits and forensics. Firewall’s primary function is to prevent
malicious data packets from being sent. In order to better
defend against cyberattacks and understand when and how
malicious actions are influencing the internet, it is necessary to
examine log files. As a result, the firewall decides whether to
'allow,' 'deny,' 'drop,' or 'reset-both' the incoming and outgoing
packets. In this research, we apply various categorization
algorithms to make sense of data logged by a firewall device.
Classifier performance is compared using the F1 score (the harmonic mean of precision and recall) and recall (sensitivity), with the random forest technique achieving an accuracy of 99%. The high accuracy rates obtained by the other methods likewise confirm that the proposed features contribute significantly to improving the firewall classification rate.
"