Refactor input to be able to set inputs as a pure array (Fixes #235)
- Also added .sync.yml to set the default rspec mock (mocha) and configure .travis.yml after pdk update
- Renamed prospector to inputs and prospector5 to prospector
trunet committed Jan 10, 2020
1 parent 9db03fe commit 93482e8
Showing 10 changed files with 453 additions and 390 deletions.
5 changes: 5 additions & 0 deletions .sync.yml
@@ -0,0 +1,5 @@
spec/spec_helper.rb:
  mock_with: ':mocha'
.travis.yml:
  user: pcfens
  secure: "z1SbP/Hisr5k66XL/ACLsZ/fG7cCpwl8apjZzt/YciWizwReioU2EkLr5tvXdUC10aIH6H7XBdA9XwPqwXa81cIqcdIHlRMIbosMUGYaXcUm1xhctB3GvEDqsxFqdZSHYXax+IR6Wt507Eop+iU3S5pf/zJcp4uSKQVapCMoeVCEQYLRwllgeaqtEUZwqOUwPk31C4YZxwrzmgbIVyXmPrp3SDToXaQm4S4RkayOqHH2lYi8isz3IPPQvDZY5681TBpo35AbsIRbhiLzGlBHbgRaE2dz7J1Gs8MBGFyrtDaPtc9UpbgEmyxgmaPs3NIeZkmfVoosjt2AHRsoMZB7ntaPAQ20mk44ugMhxd5HX8t7QdLPiYQqgA3O4QfKraxPzdEjYVs9Pf7BBgY4JpGSOAD3dlWNK0U40MzKe74cj6dshg9SfIdyf3M3MmI0KIIvdKhpgl8mSIL8MCWjnYYNpQMQDFgyrXvePnkPVlt7zlBxn+LJFFx3VLGNfSWbKavITM/nrvjpFkQZ34mPHTtTUYnT6HVehtwPd5x6ILqYcppEeeiloa4uLWhW/vg0wAOdOBv2IALdAqRMC56ODPK33gFRkX+CclsegtOh2In407njbXXZBQrY5h3SXuEVxZcFhGVTxJIV29viuWFSm7VF0a7IUmEbVrM23bqeaM+aOgs="
2 changes: 1 addition & 1 deletion .travis.yml
@@ -51,4 +51,4 @@ deploy:
on:
tags: true
all_branches: true
condition: "$DEPLOY_TO_FORGE = yes"
condition: "$DEPLOY_TO_FORGE = yes"
5 changes: 4 additions & 1 deletion README.md
@@ -137,6 +137,9 @@ input declarations down the hiera hierarchy. That behavior can be changed by con
[lookup_options](https://docs.puppet.com/puppet/latest/reference/lookup_quick.html#setting-lookupoptions-in-data)
flag.

`inputs` can be a Hash that follows all of the parameters listed in this documentation, or an
Array that is written as-is to the input config file.
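
For example, the same input could be expressed in hiera either way. This is a sketch: the hierarchy keys and paths are illustrative, not taken from the module's docs.

```yaml
# Hash form: each key becomes the title of a filebeat::input resource,
# and the values are validated against the module's parameters.
filebeat::inputs:
  syslogs:
    paths:
      - /var/log/syslog
---
# Array form: the list is rendered verbatim into the input config file.
filebeat::inputs:
  - type: log
    paths:
      - /var/log/syslog
```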

### Usage on Windows

When installing on Windows, this module will download the windows version of Filebeat from
Expand Down Expand Up @@ -266,7 +269,7 @@ Installs and configures filebeat.
- `fields_under_root`: [Boolean] If set to true, custom fields are stored in the top level instead of under fields
- `disable_config_test`: [Boolean] If set to true, configuration tests won't be run on config files before writing them.
- `processors`: [Hash] Processors that should be configured.
- `inputs`: [Hash] Inputs that will be created. Commonly used to create inputs using hiera
- `inputs`: [Hash] or [Array] Inputs that will be created. Commonly used to create inputs using hiera
- `setup`: [Hash] Setup that will be created. Commonly used to create setup using hiera
- `xpack`: [Hash] XPack configuration to pass to filebeat

10 changes: 7 additions & 3 deletions manifests/init.pp
@@ -42,7 +42,7 @@
# @param fields [Hash] Optional fields that should be added to each event output
# @param fields_under_root [Boolean] If set to true, custom fields are stored in the top level instead of under fields
# @param processors [Array] Processors that will be added. Commonly used to create processors using hiera.
# @param inputs [Hash] Inputs that will be created. Commonly used to create inputs using hiera
# @param inputs [Hash] or [Array] Inputs that will be created. Commonly used to create inputs using hiera
# @param setup [Hash] setup that will be created. Commonly used to create setup using hiera
# @param inputs_merge [Boolean] Whether $inputs should merge all hiera sources, or use simple automatic parameter lookup
# proxy_address [String] Proxy server to use for downloading files
@@ -87,7 +87,7 @@
Boolean $fields_under_root = $filebeat::params::fields_under_root,
Boolean $disable_config_test = $filebeat::params::disable_config_test,
Array $processors = [],
Hash $inputs = {},
Variant[Hash, Array] $inputs = {},
Hash $setup = {},
Array $modules = [],
Optional[Variant[Stdlib::HTTPUrl, Stdlib::HTTPSUrl]] $proxy_address = undef, # lint:ignore:140chars
@@ -144,7 +144,11 @@

if $package_ensure != 'absent' {
if !empty($inputs) {
create_resources('filebeat::input', $inputs)
if $inputs =~ Array {
create_resources('filebeat::input', { 'inputs' => { pure_array => true } })
} else {
create_resources('filebeat::input', $inputs)
}
}
}
}
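
In the Array branch above, the module no longer creates one resource per entry; it declares a single `filebeat::input` resource (titled `'inputs'` here) with `pure_array` set, and the input template then reads the whole `$filebeat::inputs` value from class scope. Roughly equivalent to declaring:

```puppet
# Sketch: what the Array branch effectively declares.
# The title 'inputs' is arbitrary; pure_array makes the template dump
# $filebeat::inputs to YAML instead of rendering per-input parameters.
filebeat::input { 'inputs':
  pure_array => true,
}
```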
5 changes: 3 additions & 2 deletions manifests/input.pp
@@ -48,11 +48,12 @@
Boolean $symlinks = false,
Optional[String] $pipeline = undef,
Array $processors = [],
Boolean $pure_array = false,
) {

$input_template = $filebeat::major_version ? {
'5' => 'prospector5.yml.erb',
default => 'prospector.yml.erb',
'5' => 'prospector.yml.erb',
default => 'input.yml.erb',
}

if 'filebeat_version' in $facts and $facts['filebeat_version'] != false {
42 changes: 42 additions & 0 deletions spec/defines/input_spec.rb
@@ -10,6 +10,21 @@
],
},
},
inputs => [
{
"type" => "logs",
"paths" => [
"/var/log/auth.log",
"/var/log/syslog",
],
},
{
"type" => "syslog",
"protocol.tcp" => {
"host" => "0.0.0.0:514",
},
},
],
}'
end

@@ -65,6 +80,33 @@
end
end

on_supported_os(facterversion: '2.4').each do |os, os_facts|
context "with array input support on #{os}" do
let(:facts) { os_facts }

# Docker Support
let(:title) { 'test-array' }
let(:params) do
{
'pure_array' => true,
}
end

if os_facts[:kernel] == 'Linux'
it { is_expected.to compile }

it {
is_expected.to contain_file('filebeat-test-array').with(
notify: 'Service[filebeat]',
)
is_expected.to contain_file('filebeat-test-array').with_content(
%r{- type: logs\n\s{2}paths:\n\s{2}- "/var/log/auth.log"\n\s{2}- "/var/log/syslog"\n- type: syslog\n\s{2}protocol.tcp:\n\s{4}host: 0.0.0.0:514\n},
)
}
end
end
end

context 'with no parameters' do
let(:title) { 'test-logs' }
let(:params) do
4 changes: 4 additions & 0 deletions spec/spec_helper.rb
@@ -1,3 +1,7 @@
RSpec.configure do |c|
c.mock_with :mocha
end

require 'puppetlabs_spec_helper/module_spec_helper'
require 'rspec-puppet-facts'

205 changes: 205 additions & 0 deletions templates/input.yml.erb
@@ -0,0 +1,205 @@
<%- if @pure_array -%>
<%= scope['filebeat::inputs'].to_yaml() %>
<%- else -%>
---
- type: <%= @input_type %>
<%- if @input_type == 'docker' -%>
containers:
ids:
<%- @containers_ids.each do |id| -%>
- <%= id %>
<%- end -%>
path: <%= @containers_path %>
stream: <%= @containers_stream %>
combine_partial: <%= @combine_partial %>
cri.parse_flags: <%= @cri_parse_flags %>
<%- elsif @input_type == 'syslog' -%>
protocol.<%= @syslog_protocol %>:
host: <%= @syslog_host %>
<%- else -%>
paths:
<%- @paths.each do |log_path| -%>
- <%= log_path %>
<%- end -%>
<%- if @encoding -%>
encoding: <%= @encoding %>
<%- end -%>
<%- if @include_lines.length > 0 -%>
include_lines:
<%- @include_lines.each do |include_line| -%>
- '<%= include_line %>'
<%- end -%>
<%- end -%>
<%- if @exclude_lines.length > 0 -%>
exclude_lines:
<%- @exclude_lines.each do |exclude_line| -%>
- '<%= exclude_line %>'
<%- end -%>
<%- end -%>
<%- if @exclude_files.length > 0 -%>
exclude_files:
<%- @exclude_files.each do |exclude_file| -%>
- <%= exclude_file %>
<%- end -%>
<%- end -%>
<%- if @fields.length > 0 -%>
fields:
<%- @fields.each_pair do |k, v| -%>
<%= k %>: <%= v %>
<%- end -%>
<%- end -%>
fields_under_root: <%= @fields_under_root %>
<%- if @tags.length > 0 -%>
tags:
<%- @tags.each do |tag| -%>
- <%= tag %>
<%- end -%>
<%- end -%>
<%- if @ignore_older -%>
ignore_older: <%= @ignore_older %>
<%- end -%>
<%- if @doc_type -%>
document_type: <%= @doc_type %>
<%- end -%>
<%- if @scan_frequency -%>
scan_frequency: <%= @scan_frequency %>
<%- end -%>
<%- if @harvester_buffer_size -%>
harvester_buffer_size: <%= @harvester_buffer_size %>
<%- end -%>
<%- if @max_bytes -%>
max_bytes: <%= @max_bytes %>
<%- end -%>
<%- if @symlinks -%>
symlinks: <%= @symlinks %>
<%- end -%>
<%- if @close_older -%>
close_older: <%= @close_older %>
<%- end -%>
<%- if @force_close_files -%>
force_close_files: <%= @force_close_files %>
<%- end -%>
<%- if @pipeline -%>
pipeline: <%= @pipeline %>
<%- end -%>

<%- if @json.length > 0 -%>
### JSON configuration
json:
# Decode JSON options. Enable this if your logs are structured in JSON.
# JSON key on which to apply the line filtering and multiline settings. This key
# must be top level and its value must be string, otherwise it is ignored. If
# no text key is defined, the line filtering and multiline features cannot be used.
<%- if @json['message_key'] != nil -%>
message_key: '<%= @json['message_key'] %>'
<%- end -%>

# By default, the decoded JSON is placed under a "json" key in the output document.
# If you enable this setting, the keys are copied top level in the output document.
<%- if @json['keys_under_root'] != nil -%>
keys_under_root: <%= @json['keys_under_root'] %>
<%- end -%>

# If keys_under_root and this setting are enabled, then the values from the decoded
# JSON object overwrite the fields that Filebeat normally adds (type, source, offset, etc.)
# in case of conflicts.
<%- if @json['overwrite_keys'] != nil -%>
overwrite_keys: <%= @json['overwrite_keys'] %>
<%- end -%>

# If this setting is enabled, Filebeat adds a "json_error" key in case of JSON
# unmarshaling errors or when a text key is defined in the configuration but cannot
# be used.
<%- if @json['add_error_key'] != nil -%>
add_error_key: <%= @json['add_error_key'] %>
<%- end -%>
<%- end -%>

<%- if @multiline.length > 0 -%>
multiline:
<%- if @multiline['pattern'] -%>
pattern: '<%= @multiline['pattern'] %>'
<%- end -%>
<%- if @multiline['negate'] -%>
negate: <%= @multiline['negate'] %>
<%- end -%>
<%- if @multiline['match'] -%>
match: <%= @multiline['match'] %>
<%- end -%>
<%- if @multiline['max_lines'] -%>
max_lines: <%= @multiline['max_lines'] %>
<%- end -%>
<%- if @multiline['timeout'] -%>
timeout: <%= @multiline['timeout'] %>
<%- end -%>
<%- end -%>
tail_files: <%= @tail_files %>

# Experimental: If symlinks is enabled, symlinks are opened and harvested. The harvester is opening the
# original for harvesting but will report the symlink name as source.
#symlinks: false

<%- if @backoff -%>
backoff: <%= @backoff %>
<%- end -%>
<%- if @max_backoff -%>
max_backoff: <%= @max_backoff %>
<%- end -%>
<%- if @backoff_factor -%>
backoff_factor: <%= @backoff_factor %>
<%- end -%>

# Experimental: Max number of harvesters that are started in parallel.
# Default is 0 which means unlimited
<%- if @harvester_limit -%>
harvester_limit: <%= @harvester_limit %>
<%- end -%>

### Harvester closing options

# Close inactive closes the file handler after the predefined period.
# The period starts when the last line of the file was read, not from the file's ModTime.
# Time strings like 2h (2 hours), 5m (5 minutes) can be used.
<%- if @close_inactive -%>
close_inactive: <%= @close_inactive %>
<%- end -%>

# Close renamed closes a file handler when the file is renamed or rotated.
# Note: Potential data loss. Make sure to read and understand the docs for this option.
close_renamed: <%= @close_renamed %>

# When enabling this option, a file handler is closed immediately in case a file can't be found
# any more. In case the file shows up again later, harvesting will continue at the last known position
# after scan_frequency.
close_removed: <%= @close_removed %>

# Closes the file handler as soon as the harvester reaches the end of the file.
# By default this option is disabled.
# Note: Potential data loss. Make sure to read and understand the docs for this option.
close_eof: <%= @close_eof %>

### State options

# If a file's modification time is older than clean_inactive, its state is removed from the registry.
# By default this is disabled.
<%- if @clean_inactive -%>
clean_inactive: <%= @clean_inactive %>
<%- end -%>

# Immediately removes the state for files which can no longer be found on disk
clean_removed: <%= @clean_removed %>

# Close timeout closes the harvester after the predefined time.
# This is independent if the harvester did finish reading the file or not.
# By default this option is disabled.
# Note: Potential data loss. Make sure to read and understand the docs for this option.
<%- if @close_timeout -%>
close_timeout: <%= @close_timeout %>
<%- end -%>
<%- if @processors.length > 0 -%>
# Processors related only to this specific input
processors:
<%- %><%= @processors.to_yaml.lines.drop(1).join.gsub(/^/, ' ') -%>
<%- end -%>
<%- end %>
<%- end %>
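
The `pure_array` branch of this template leans entirely on Ruby's YAML serialization of the whole `filebeat::inputs` value. A minimal sketch of what that single `to_yaml` call emits, using sample data mirroring the spec fixture above:

```ruby
require 'yaml'

# Sample input array mirroring the spec fixture (illustrative data only).
inputs = [
  { 'type' => 'logs',
    'paths' => ['/var/log/auth.log', '/var/log/syslog'] },
  { 'type' => 'syslog',
    'protocol.tcp' => { 'host' => '0.0.0.0:514' } },
]

# The template's pure_array branch reduces to this one call:
yaml = inputs.to_yaml
puts yaml
```

The emitted document starts with `---` and round-trips cleanly through a YAML parser, which is why the spec can match the rendered file with a multi-line regex.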