We’ve previously written about how event and endpoint logs are essential for maintaining endpoint and overall system security. A breach at an endpoint could quickly bring attackers straight to the heart of your IT systems and networks. Our previous post showed the importance of centralizing operating system logs and walked through shipping Linux logs to Apica using Fluent Bit. In this post, we’ll shed some light on centralizing Windows logs on Apica.
Why centralize Windows logs?
Windows devices make up the majority of most business IT ecosystems and networks. Employee laptops, desktops, and even web servers often run on Windows. Windows creates log records for events occurring within software and hardware components, grouped into the following event categories:
- Logon events
- Security events like authentication failures, login attempts, or file deletion
- Directory service access and operations events
- Policy changes
- System events like driver failures, reboots, and startup failures
You can access Windows logs for a Windows device using Event Viewer. Accessing Event Viewer on a laptop or desktop leads you to a long list of Windows-recorded events on that machine. Now, multiply that by the number of Windows devices on your network, and you’ll realize that there’s far more data than your system admins can deal with sanely without effective log centralization and data management. Here’s where Apica can help.
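To get a quick sense of this volume from the command line, you can also query the Windows Event Log with PowerShell. The snippet below is a minimal, illustrative example; the log names and event count are placeholders you can adjust.

# List the 20 most recent events from the System log
Get-WinEvent -LogName System -MaxEvents 20 |
    Select-Object TimeCreated, Id, LevelDisplayName, ProviderName, Message |
    Format-Table -AutoSize

# Count how many events the Security log holds on this one machine
(Get-WinEvent -ListLog Security).RecordCount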
Why use Apica for log data convergence?
Apica is a data platform built for engineering teams that need on-demand and real-time machine data, regardless of the source or time of ingestion. The Apica platform offers numerous benefits for log centralization. Although you can use solutions like Windows Operations Manager to centralize logs and monitor services, devices, and operations across your Windows environments from a single console, it can be tricky to run and manage, and licensing can get expensive.
By shipping your Windows logs to Apica, you can:
- Protect your Windows logs from accidental loss or inaccessibility during downtimes by storing them in a separate location.
- Create custom rules, automation triggers, and monitors with a few clicks.
- Carry out complex searches using regular expressions and advanced queries without consuming the host’s computing resources.
- Store logs for longer durations on any S3-compatible object store and have them fully indexed and searchable.
- Reduce MTTD and MTTR by helping administrators get to root causes and fix issues faster.
- Enable role-based access to Windows logs outside the system that generates them, thereby increasing data security and promoting cross-team collaboration.
- Compare and correlate events across the application and infrastructure layers.
- Augment log data with security events using built-in SIEM rules and detect security-related events across your Windows systems in real-time.
In addition, you can visualize your Windows log data and metrics and create alerts for thresholds and events that trigger remediation or other process automation workflows in your ITOM toolkit.
Shipping Windows logs to Apica using Fluent Bit
Several log forwarders can ship Windows logs to an external server or endpoint, including NXLog, Windows Event Forwarding, and Fluent Bit. We’ve developed a simple script that helps you set up log forwarding to Apica using Fluent Bit.
You’ll need access to an Apica instance to try out the following steps. If you do not have access to one, you can sign up for a 14-day free trial of Apica SaaS. Alternatively, there are plenty of deployment options to choose from if you’d like to try out Apica PaaS. You could go for the free-forever Apica PaaS Community Edition or try out our straightforward deployments on Kubernetes, AWS, MicroK8s, or Docker.
To forward your Windows logs to Apica using Fluent Bit, do the following.
- Check the connectivity from your Windows machine to your Apica cluster by sending the following HTTP payload using the `Invoke-RestMethod` PowerShell utility. Ensure that you change the `@timestamp` below to match your current date and time before executing `Invoke-RestMethod`.
# Build the JSON payload; update @timestamp to your current date and time
$body = @{
    "message"    = "Apica JSON services are up"
    "@timestamp" = "2021-09-23T06:25:18Z"
    "host"       = "curl_host"
    "proc_id"    = "json-batch-test"
    "app_name"   = "curl"
    "namespace"  = "windows-curl-namespace"
    "cluster_id" = "Apica-json-batch-test"
} | ConvertTo-Json

# Replace <token> with your Apica ingest token
$header = @{
    "Accept"        = "application/json"
    "Authorization" = "Bearer <token>"
    "Content-Type"  = "application/json"
}
- Execute the following `Invoke-RestMethod` command to send the above payload to your Apica instance endpoint.
Invoke-RestMethod -Uri "https://<Apica endpoint>/v1/json_batch" -Method 'Post' -Body $body -Headers $header | ConvertTo-HTML
- If the payload is sent successfully, create a temporary folder on your Windows machine and navigate to it (for example, `D:\test`).
- In this folder, download the `fluentbit-install.ps1` PowerShell script and the `fluent-bit.conf` file. When executed, the PowerShell script installs Fluent Bit on your Windows machine based on the configuration you provide in the `fluent-bit.conf` file.
- Configure the `[OUTPUT]` section of the `fluent-bit.conf` file to match the details of your Apica cluster.
[OUTPUT]
    Name             http
    Match            *
    # Replace Host and Port with your Apica instance's endpoint details
    Host             localhost
    Port             80
    URI              /v1/json_batch
    Format           json
    tls              off
    tls.verify       off
    net.keepalive    off
    compress         gzip
    # Replace ${YOUR_Apica_INGEST_TOKEN} with your Apica ingest token
    Header           Authorization Bearer ${YOUR_Apica_INGEST_TOKEN}
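The downloaded `fluent-bit.conf` already defines which Windows Event Log channels Fluent Bit reads. For reference, a minimal `[INPUT]` section using Fluent Bit’s winlog plugin might look like the sketch below; the channel list and polling interval are illustrative and may differ from the shipped configuration.

[INPUT]
    # Read events from the Windows Event Log (illustrative channel list)
    Name          winlog
    Channels      Setup,Windows PowerShell,Security,System,Application
    Interval_Sec  1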
- Open PowerShell on your Windows machine in Administrator mode. By default, Windows does not allow you to execute scripts due to execution policies, so update your execution policy by running the following.
Set-ExecutionPolicy unrestricted
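If you’d rather not relax the machine-wide policy, one alternative is to scope the change to the current PowerShell session only, for example:

Set-ExecutionPolicy -ExecutionPolicy Bypass -Scope Process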
- In the same elevated PowerShell session, navigate to the folder where you downloaded the script (for example, `D:\test`).
- Execute the following script to install Fluent Bit on your machine.
PS D:\test> .\fluentbit-install.ps1
[SC] CreateService SUCCESS
The fluent-bit service is starting.
The fluent-bit service was started successfully.
- Next, run `services.msc`. You should see that the Fluent Bit service we installed is now running.
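You can also verify the service from PowerShell; the service name fluent-bit matches the installer output shown above.

Get-Service -Name fluent-bit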
Now, head over to your Apica instance. You should see all of your Windows logs being forwarded to Apica and appearing under the `Windows` namespace.
Conclusion
If your IT environment contains a lot of components that run on Windows, there’s a lot to gain from centralizing your logs on Apica.
Apica is the world’s only unified data platform for real-time monitoring, observability, log aggregation, and analytics with infinite storage scale and zero storage tax. Apica ships with a host of integrations and tooling that let you exercise cross-platform, real-time monitoring, observability, and analysis, threat and bug forensics, and process automation – all while leveraging robust built-in security measures, promoting cross-team collaboration, and maintaining regulatory compliance. Moreover, because Apica uses object storage, we neither dictate how much log data you can store or for how long, nor force you to favor logging specific components of your environment over others – you get to log everything and store and manage all your data on your terms.
There’s always the option of building your own log management and analytics stack using open-source tooling or leveraging tools like Microsoft’s own Operations Manager. However, this can be more challenging than it appears and may not scale the way you need it to. If you find the benefits and ease of integration compelling enough, do give Apica SaaS a try.