8 changes: 7 additions & 1 deletion .generator/schemas/v1/openapi.yaml
@@ -5931,6 +5931,12 @@ components:
Scope down exclusion filter to only a subset of logs with a log query.
example: "*"
type: string
sample_attribute:
description: |-
Sample attribute to use for the sampling of logs going through this exclusion filter.
When set, only the logs with the specified attribute are sampled.
example: "@ci.job_id"
type: string
sample_rate:
description: |-
Sample rate to apply to logs going through this exclusion filter,
@@ -29894,7 +29900,7 @@ paths:
Update an index as identified by its name.
Returns the Index object passed in the request body when the request is successful.

Using the `PUT` method updates your indexs configuration by **replacing**
Using the `PUT` method updates your index's configuration by **replacing**
your current configuration with the new one sent to your Datadog organization.
operationId: UpdateLogsIndex
parameters:
1 change: 1 addition & 0 deletions examples/v1/logs-indexes/CreateLogsIndex.rb
@@ -14,6 +14,7 @@
DatadogAPIClient::V1::LogsExclusion.new({
filter: DatadogAPIClient::V1::LogsExclusionFilter.new({
query: "*",
sample_attribute: "@ci.job_id",
sample_rate: 1.0,
}),
name: "payment",
1 change: 1 addition & 0 deletions examples/v1/logs-indexes/UpdateLogsIndex.rb
@@ -15,6 +15,7 @@
DatadogAPIClient::V1::LogsExclusion.new({
filter: DatadogAPIClient::V1::LogsExclusionFilter.new({
query: "*",
sample_attribute: "@ci.job_id",
sample_rate: 1.0,
}),
name: "payment",
10 changes: 5 additions & 5 deletions features/v1/logs_indexes.feature
@@ -11,21 +11,21 @@ Feature: Logs Indexes
@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Create an index returns "Invalid Parameter Error" response
Given new "CreateLogsIndex" request
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "exclusion_filters": [{"filter": {"query": "*", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "name": "main", "num_flex_logs_retention_days": 360, "num_retention_days": 15, "tags": ["team:backend", "env:production"]}
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "exclusion_filters": [{"filter": {"query": "*", "sample_attribute": "@ci.job_id", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "name": "main", "num_flex_logs_retention_days": 360, "num_retention_days": 15, "tags": ["team:backend", "env:production"]}
When the request is sent
Then the response status is 400 Invalid Parameter Error

@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Create an index returns "OK" response
Given new "CreateLogsIndex" request
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "exclusion_filters": [{"filter": {"query": "*", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "name": "main", "num_flex_logs_retention_days": 360, "num_retention_days": 15, "tags": ["team:backend", "env:production"]}
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "exclusion_filters": [{"filter": {"query": "*", "sample_attribute": "@ci.job_id", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "name": "main", "num_flex_logs_retention_days": 360, "num_retention_days": 15, "tags": ["team:backend", "env:production"]}
When the request is sent
Then the response status is 200 OK

@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Create an index returns "Unprocessable Entity" response
Given new "CreateLogsIndex" request
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "exclusion_filters": [{"filter": {"query": "*", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "name": "main", "num_flex_logs_retention_days": 360, "num_retention_days": 15, "tags": ["team:backend", "env:production"]}
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "exclusion_filters": [{"filter": {"query": "*", "sample_attribute": "@ci.job_id", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "name": "main", "num_flex_logs_retention_days": 360, "num_retention_days": 15, "tags": ["team:backend", "env:production"]}
When the request is sent
Then the response status is 422 Unprocessable Entity

@@ -73,15 +73,15 @@ Feature: Logs Indexes
Scenario: Update an index returns "Invalid Parameter Error" response
Given new "UpdateLogsIndex" request
And request contains "name" parameter from "REPLACE.ME"
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "disable_daily_limit": false, "exclusion_filters": [{"filter": {"query": "*", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "num_flex_logs_retention_days": 360, "num_retention_days": 15, "tags": ["team:backend", "env:production"]}
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "disable_daily_limit": false, "exclusion_filters": [{"filter": {"query": "*", "sample_attribute": "@ci.job_id", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "num_flex_logs_retention_days": 360, "num_retention_days": 15, "tags": ["team:backend", "env:production"]}
When the request is sent
Then the response status is 400 Invalid Parameter Error

@generated @skip @team:DataDog/logs-backend @team:DataDog/logs-core
Scenario: Update an index returns "OK" response
Given new "UpdateLogsIndex" request
And request contains "name" parameter from "REPLACE.ME"
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "disable_daily_limit": false, "exclusion_filters": [{"filter": {"query": "*", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "num_flex_logs_retention_days": 360, "num_retention_days": 15, "tags": ["team:backend", "env:production"]}
And body with value {"daily_limit": 300000000, "daily_limit_reset": {"reset_time": "14:00", "reset_utc_offset": "+02:00"}, "daily_limit_warning_threshold_percentage": 70, "disable_daily_limit": false, "exclusion_filters": [{"filter": {"query": "*", "sample_attribute": "@ci.job_id", "sample_rate": 1.0}, "name": "payment"}], "filter": {"query": "source:python"}, "num_flex_logs_retention_days": 360, "num_retention_days": 15, "tags": ["team:backend", "env:production"]}
When the request is sent
Then the response status is 200 OK

2 changes: 1 addition & 1 deletion lib/datadog_api_client/v1/api/logs_indexes_api.rb
@@ -355,7 +355,7 @@ def update_logs_index(name, body, opts = {})
# Update an index as identified by its name.
# Returns the Index object passed in the request body when the request is successful.
#
# Using the `PUT` method updates your indexs configuration by **replacing**
# Using the `PUT` method updates your index's configuration by **replacing**
# your current configuration with the new one sent to your Datadog organization.
#
# @param name [String] Name of the log index.
13 changes: 12 additions & 1 deletion lib/datadog_api_client/v1/models/logs_exclusion_filter.rb
@@ -25,6 +25,10 @@ class LogsExclusionFilter
# Scope down exclusion filter to only a subset of logs with a log query.
attr_accessor :query

# Sample attribute to use for the sampling of logs going through this exclusion filter.
# When set, only the logs with the specified attribute are sampled.
attr_accessor :sample_attribute

# Sample rate to apply to logs going through this exclusion filter,
# a value of 1.0 excludes all logs matching the query.
attr_reader :sample_rate
@@ -36,6 +40,7 @@ class LogsExclusionFilter
def self.attribute_map
{
:'query' => :'query',
:'sample_attribute' => :'sample_attribute',
:'sample_rate' => :'sample_rate'
}
end
@@ -45,6 +50,7 @@ def self.attribute_map
def self.openapi_types
{
:'query' => :'String',
:'sample_attribute' => :'String',
:'sample_rate' => :'Float'
}
end
@@ -71,6 +77,10 @@ def initialize(attributes = {})
self.query = attributes[:'query']
end

if attributes.key?(:'sample_attribute')
self.sample_attribute = attributes[:'sample_attribute']
end

if attributes.key?(:'sample_rate')
self.sample_rate = attributes[:'sample_rate']
end
@@ -121,6 +131,7 @@ def ==(o)
return true if self.equal?(o)
self.class == o.class &&
query == o.query &&
sample_attribute == o.sample_attribute &&
sample_rate == o.sample_rate &&
additional_properties == o.additional_properties
end
@@ -129,7 +140,7 @@ def ==(o)
# @return [Integer] Hash code
# @!visibility private
def hash
[query, sample_rate, additional_properties].hash
[query, sample_attribute, sample_rate, additional_properties].hash
end
end
end
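To make the new field concrete, here is a minimal plain-Ruby sketch of the behavior the field descriptions document: when `sample_attribute` is set, only logs carrying that attribute are candidates for sampling, and a `sample_rate` of 1.0 excludes every candidate. The `excluded?` helper and its logic are illustrative assumptions derived from the descriptions above, not the Datadog server implementation, and no gem is required.

```ruby
# Hypothetical sketch of the documented exclusion-filter semantics.
# A log is excluded only if the filter query matches, the log carries
# sample_attribute (when one is set), and it falls within sample_rate.
def excluded?(log, query_match:, sample_rate:, sample_attribute: nil)
  return false unless query_match                                 # filter query must match first
  return false if sample_attribute && !log.key?(sample_attribute) # attribute scopes the sampling
  rand < sample_rate                                              # rate 1.0 => always excluded
end

ci_log    = { "@ci.job_id" => "123", "message" => "build ok" }
other_log = { "message" => "user login" }

excluded?(ci_log,    query_match: true, sample_attribute: "@ci.job_id", sample_rate: 1.0) # => true
excluded?(other_log, query_match: true, sample_attribute: "@ci.job_id", sample_rate: 1.0) # => false
```

With `sample_attribute: "@ci.job_id"` and `sample_rate: 1.0`, as in the examples in this PR, every matching log that has `@ci.job_id` is dropped while logs without it pass through untouched.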