This is a Logstash input plugin for Google Pub/Sub. The plugin can subscribe to a topic and ingest messages.
The main motivation behind the development of this plugin was to ingest Stackdriver Logging messages via the Exported Logs feature of Stackdriver Logging.
It is fully free and fully open source. The license is Apache 2.0, meaning you are pretty much free to use it however you want in whatever way.
You must first create a Google Cloud Platform project and enable the Google Pub/Sub API. If you intend to use the plugin to ingest Stackdriver Logging messages, you must also enable the Stackdriver Logging API and configure log exporting to Pub/Sub. There is plentiful information on https://cloud.google.com/ to get started:
- Google Cloud Platform Projects and Overview
- Google Cloud Pub/Sub documentation
- Stackdriver Logging documentation
Currently, this module requires you to create a topic manually and specify it in the Logstash config file. You must also specify a subscription, but the plugin will attempt to create the pull-based subscription on its own.
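For example, the topic can be created ahead of time with the gcloud CLI. This is a sketch under assumed names: the topic and subscription names below are placeholders matching the example configuration later in this document, and pre-creating the subscription is optional because the plugin will attempt to create it itself.

# Create the topic the plugin will read from (must exist before Logstash starts).
gcloud pubsub topics create logstash-input-dev

# Optionally pre-create the pull subscription; otherwise the plugin
# attempts to create it at startup.
gcloud pubsub subscriptions create logstash-sub --topic logstash-input-dev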
All messages received from Pub/Sub will be converted to a Logstash event and added to the processing pipeline queue. All Pub/Sub messages will be acknowledged and removed from the Pub/Sub topic (please see the Pub/Sub documentation for more about these concepts).
It is generally assumed that incoming messages will be JSON and added to the Logstash event as-is. However, if a plain-text message is received, the plugin will return the raw text in raw_message in the Logstash event.
You have two options for authentication depending on where you run Logstash.
- If you are running Logstash outside of Google Cloud Platform, then you will need to create a Google Cloud Platform Service Account and specify the full path to its JSON private key file in your config with the json_key_file setting. If you don't want to upload the key file, you can instead provide the content of the JSON private key as a string in the json_key_file_content setting. You must assign sufficient roles to the Service Account to create a subscription and to pull messages from the subscription; see the Google Cloud documentation on Service Accounts and IAM roles, and the setup sketch after this list.
- If you are running Logstash on a Google Compute Engine instance, you may opt to use Application Default Credentials. In this case, you will not need to specify a JSON private key file in your config.
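As a rough sketch of the Service Account setup (the account name, key path, and role choice here are illustrative assumptions, not something the plugin prescribes), the gcloud CLI could be used like this:

# Create a dedicated service account (name is a placeholder).
gcloud iam service-accounts create logstash-pubsub-reader

# Grant Pub/Sub permissions broad enough to create a subscription and
# pull messages; roles/pubsub.editor is one option, a narrower custom
# role would also work.
gcloud projects add-iam-policy-binding my-project-1234 \
  --member "serviceAccount:logstash-pubsub-reader@my-project-1234.iam.gserviceaccount.com" \
  --role "roles/pubsub.editor"

# Download a JSON key to point the json_key_file setting at.
gcloud iam service-accounts keys create /path/to/pkey.json \
  --iam-account logstash-pubsub-reader@my-project-1234.iam.gserviceaccount.com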
If you intend to use the Logstash plugin for Stackdriver Logging message ingestion, you must first manually set up the Export option to Cloud Pub/Sub and manually create the topic. Please see the more detailed instructions for Exported Logs and ensure that the necessary permissions have also been manually configured.
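For example, an export sink to Pub/Sub could be created with the gcloud CLI along the lines below. This is a hedged sketch: the sink name is an assumed placeholder, the topic matches the example configuration later in this document, and the sink's writer identity still needs permission to publish to the topic.

# Create a log sink that exports log entries to the Pub/Sub topic.
gcloud logging sinks create logstash-export \
  pubsub.googleapis.com/projects/my-project-1234/topics/logstash-input-dev

# The create command prints a writer identity (a service account).
# Grant that identity permission to publish to the topic; replace
# WRITER_IDENTITY with the printed value.
gcloud pubsub topics add-iam-policy-binding logstash-input-dev \
  --member "serviceAccount:WRITER_IDENTITY" \
  --role "roles/pubsub.publisher"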
Logging messages from Stackdriver Logging exported to Pub/Sub are received as JSON and converted to a Logstash event as-is.
Below is a copy of the included example.conf-tmpl
file that shows a basic
configuration for this plugin.
input {
    google_pubsub {
        # Your GCP project id (name)
        project_id => "my-project-1234"

        # The topic name must match an existing Pub/Sub topic. You
        # must first create this topic by hand and ensure you are exporting
        # logging to this pubsub topic.
        topic => "logstash-input-dev"

        # The subscription name is customizable. The plugin will attempt to
        # create the subscription (using the topic name above).
        subscription => "logstash-sub"

        # If you are running logstash within GCE, it will use
        # Application Default Credentials and use GCE's metadata
        # service to fetch tokens. However, if you are running logstash
        # outside of GCE, you will need to specify the service account's
        # JSON key file below.
        #json_key_file => "/home/erjohnso/pkey.json"
    }
}

output { stdout { codec => rubydebug } }
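Assuming the configuration above is saved to a local file (the file name here is just an example), it can be exercised with a quick test run:

bin/logstash -f example.conf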
Logstash provides infrastructure to automatically generate documentation for this plugin. We use the asciidoc format to write documentation, so any comments in the source code will first be converted into asciidoc and then into html. All plugin documentation is placed under one central location.
- For formatting code or config examples, you can use the asciidoc [source,ruby] directive
- For more asciidoc formatting tips, see the excellent reference at https://github.com/elastic/docs#asciidoc-guide
Need help? Try #logstash on freenode IRC or the https://discuss.elastic.co/c/logstash discussion forum.
- To get started, you'll need JRuby with the Bundler gem installed.
- You'll also need a Logstash installation to build the plugin against.
- Create a new plugin or clone an existing one from the GitHub logstash-plugins organization. We also provide example plugins.
- export LOGSTASH_SOURCE=1 and point LOGSTASH_PATH to a local Logstash, e.g. export LOGSTASH_PATH=/opt/local/logstash-8.7.0
- Install Ruby dependencies
bundle install
- Install Java dependencies. This regenerates the lib/logstash-input-google_pubsub_jars.rb script used to load the .jar dependencies when the plugin starts.
./gradlew vendor
NOTE: This step is necessary whenever build.gradle is updated.
- Update your dependencies
bundle install
- Run Ruby tests
bundle exec rspec
- Edit the Logstash Gemfile and add the local plugin path, for example:
gem "logstash-filter-awesome", :path => "/your/local/logstash-filter-awesome"
- Install plugin
bin/logstash-plugin install --no-verify
- Run Logstash with your plugin
bin/logstash -e 'filter {awesome {}}'
At this point any modifications to the plugin code will be applied to this local Logstash setup. After modifying the plugin, simply rerun Logstash.
You can use the same 2.1 method to run your plugin in an installed Logstash by editing its Gemfile and pointing the :path to your local plugin development directory, or you can build the gem and install it using:
- Build your plugin gem
gem build logstash-filter-awesome.gemspec
- Install the plugin from the Logstash home
bin/logstash-plugin install --no-verify
- Start Logstash and proceed to test the plugin
All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.
Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.
It is more important to the community that you are able to contribute.
For more information about contributing, see the CONTRIBUTING file