
[ILM] Policy phases redesign #88671

Merged: 32 commits, Jan 27, 2021

Conversation

@yuliacech yuliacech commented Jan 19, 2021

Summary

This PR is part of the data tiers redesign work in ILM.

Changes include:

  • Page title layout
  • Policy name input layout
  • Vertical highlight for active phases (not completed yet)
  • Toggle for 'move to warm phase after rollover' is removed
  • Phase timing input layout
  • 'Settings' button toggles between phase description and phase form controls
  • All form controls in a phase use 'on/off' labels and hide any additional controls if disabled
  • All form controls have a new style (prepend label)
  • Changed concept for optional fields: number of replicas and index priority used to be optional fields, but now they can be disabled with a switch. That means that if those options are enabled, a value in the input field is required (see the sketch after this list).
  • 'save' and 'show request' buttons layout
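
For illustration, below is a minimal TypeScript sketch of how such an on/off switch could map to the serialized policy; the names (`ReplicasFormValues`, `serializeReplicas`) are hypothetical and not taken from this PR's code.

```ts
// Minimal sketch under assumed names: when the switch is off the setting is omitted
// from the policy; when it is on, a value becomes required.
interface ReplicasFormValues {
  setReplicasEnabled: boolean; // the new on/off switch
  numberOfReplicas?: number; // required only while the switch is on
}

function serializeReplicas(
  values: ReplicasFormValues
): { number_of_replicas: number } | undefined {
  if (!values.setReplicasEnabled) {
    // Switch is off: omit the setting from the serialized policy entirely.
    return undefined;
  }
  if (values.numberOfReplicas == null) {
    // Switch is on: a value is now required, so the form should block saving.
    throw new Error('Number of replicas is required when the setting is enabled');
  }
  return { number_of_replicas: values.numberOfReplicas };
}
```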

Screenshots


Edit policy form

Screenshot 2021-01-19 at 16 08 58

Hot phase expanded form

Screenshot 2021-01-25 at 19 05 58

Warm phase expanded form

Screenshot 2021-01-25 at 19 06 38

Cold phase expanded form

Screenshot 2021-01-25 at 19 07 00

Checklist

Delete any items that are not applicable to this PR.

  • Unit or functional tests were updated or added to match the most common scenarios
  • Any UI touched in this PR is usable by keyboard only (learn more about keyboard accessibility)
  • Any UI touched in this PR does not create any new axe failures (run axe in browser: FF, Chrome)

How to test

  1. To install all dependencies, run `yarn kbn bootstrap`.
  2. To start Kibana and Elasticsearch, run `yarn start` and, in a separate terminal tab, `yarn es snapshot`.
  3. Navigate to Stack Management -> Index Lifecycle Policies and click the 'create policy' button or click an existing policy in the list.

It is also possible to test with the Amsterdam theme in Kibana:

  1. Navigate to Stack Management -> Advanced Settings.
  2. Change 'Theme version' from v7 to v8 (beta).
  3. Click the 'save changes' button in the bottom bar.

TODO in follow-up PRs

  • Update default rollover switch
  • Complete visuals for 'active' phase highlight
  • Delete phase redesign
  • Bottom blocks for hot/warm/delete phases with timings

Release note

Data tier formalization in Elasticsearch offers a built-in way of stratifying nodes into different tiers, from fastest to slowest machines. Data is allocated to tiers using ILM phases, which prompted a redesign of the previous ILM UI and UX (a rough example policy is sketched after the list below). The new UI and UX are designed to:

  • Enable users to understand how their data is flowing through phases and tiers and how this interacts with time spent in a phase (and tier).
  • Guide a user toward a pattern of storing data in a phase in its corresponding tier while preserving the ability to diverge from this pattern as needed.
  • Expand in complexity only as the user attempts more complex actions. The previous UI was a very unopinionated reflection of the underlying RESTful policy API; we wanted to give users more guidance than that and ensure that only the most important information is surfaced.
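
As a rough illustration of the pattern the UI guides users toward, here is a hedged TypeScript sketch of such a policy. The phase timings, priorities, and size thresholds are assumptions for illustration only, not defaults introduced by this PR; on data-tier deployments, ILM's implicit migrate step moves the index to the matching tier as it enters each phase.

```ts
// Example only: data moves through data_hot -> data_warm -> data_cold as it ages.
const examplePolicy = {
  policy: {
    phases: {
      hot: {
        actions: {
          rollover: { max_age: '30d', max_size: '50gb' },
          set_priority: { priority: 100 },
        },
      },
      warm: {
        min_age: '30d', // time after rollover before the index moves to the warm tier
        actions: {
          set_priority: { priority: 50 },
        },
      },
      cold: {
        min_age: '90d',
        actions: {
          set_priority: { priority: 0 },
        },
      },
    },
  },
};
```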

Screenshots


localhost_5601_hlg_app_management_data_index_lifecycle_management_policies_edit (7)
localhost_5601_hlg_app_management_data_index_lifecycle_management_policies_edit (9)
Screenshot 2021-02-18 at 11 23 09
Screenshot 2021-02-18 at 11 23 25

Gif

Kapture 2021-02-18 at 12 38 08

@yuliacech yuliacech added Feature:ILM Team:Kibana Management Dev Tools, Index Management, Upgrade Assistant, ILM, Ingest Node Pipelines, and more v7.12.0 v8.0.0 release_note:enhancement labels Jan 19, 2021
@yuliacech (Contributor Author):
@elasticmachine merge upstream

@yuliacech (Contributor Author):
@elasticmachine merge upstream

@yuliacech yuliacech requested a review from jloleysens January 20, 2021 13:23
@yuliacech yuliacech marked this pull request as ready for review January 20, 2021 13:24
@yuliacech yuliacech requested review from a team as code owners January 20, 2021 13:24
@elasticmachine (Contributor):
Pinging @elastic/es-ui (Team:Elasticsearch UI)

@yuliacech yuliacech requested a review from mdefazio January 20, 2021 13:51
@yuliacech (Contributor Author):
@elasticmachine merge upstream

return (
  isNewPolicy || // enable index priority for new policies
  !policy.phases[phase]?.actions || // enable index priority for new phases
  policy.phases[phase]?.actions?.set_priority != null // enable index priority if it's set
);
@yuliacech (Contributor Author) commented:

Hi @jloleysens, here is the change to setting the default index priority that we discussed. Could you please have a look?

Contributor reply:

Thanks, this makes sense to me! @andreidan @dakrone, this is in line with the current index priority behaviour. I wonder if, at this point, we could change the existing behaviour and exclude index priority by default on phases. Currently, it is on for all phases by default for new policies, with defaults of highest for hot, second highest for warm, and lowest for cold.
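
For context, index priority controls the order in which indices are recovered after a node restart. A hedged TypeScript sketch of the kind of per-phase defaults under discussion (the exact numbers are assumptions, not quoted from this PR):

```ts
// Illustrative defaults only: hot highest, warm second highest, cold lowest.
const defaultIndexPriority = {
  hot: { set_priority: { priority: 100 } }, // recovered first after a restart
  warm: { set_priority: { priority: 50 } },
  cold: { set_priority: { priority: 0 } }, // recovered last
};
```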

@jloleysens (Contributor) left a comment:

Excellent work @yuliacech, this is a very cool enhancement!

Happy for you to merge; it would be cool to drive the question about index priority to its conclusion here, but I think we can also return to it 👍🏻

@mdefazio (Contributor) left a comment:

This looks good! Thanks for working through the back and forth of the designs!

@yuliacech (Contributor Author):
@elasticmachine merge upstream

@yuliacech (Contributor Author):
@elasticmachine merge upstream

@yuliacech (Contributor Author):
@elasticmachine merge upstream

@yuliacech yuliacech added release_note:skip Skip the PR/issue when compiling release notes and removed release_note:enhancement labels Jan 27, 2021
@yuliacech yuliacech merged commit d931ed6 into elastic:master Jan 27, 2021
yuliacech added a commit to yuliacech/kibana that referenced this pull request Jan 27, 2021
* Phases redesign

* Title and name field layout

* Active highlight wip

* Copy comments

* Updated data allocation dropdown

* Min age error message

* Fixed tests

* Fixed edit policy integration tests

* Fixed more tests

* Cleaned up test files

* Use hotProperty instead of a string

* Clean up in phase component

* Fixed i18n files

* Updated optional fields

* Updated aria attributes after running axe tests

* Added review suggestions

* Reversed data allocation field changes

* Fixed type error

* Reversed on/off label and prepend input label

* Deleted property consts from phases components

* Removed not needed i18n consts and added i18n where missing

* Fixed merge conflicts with master

Co-authored-by: Kibana Machine <42973632+kibanamachine@users.noreply.github.com>
yuliacech added a commit that referenced this pull request Jan 28, 2021, with the same commit message as above.
@yuliacech yuliacech mentioned this pull request Feb 11, 2021
@jloleysens jloleysens added release_note:feature Makes this part of the condensed release notes and removed release_note:skip Skip the PR/issue when compiling release notes labels Feb 17, 2021
@kibanamachine (Contributor) commented Feb 17, 2021

💔 Build Failed

Failed CI Steps


Test Failures

Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/ml/data_frame_analytics/cloning·ts.machine learning data frame analytics jobs cloning supported by UI form classification job supported by the form opens the existing job in the data frame analytics job wizard

Link to Jenkins

Standard Out

Failed Tests Reporter:
  - Test has not failed recently on tracked branches

[00:00:00]       │
[00:00:00]         └-: machine learning
[00:00:00]           └-> "before all" hook
[00:00:00]           └-: 
[00:00:00]             └-> "before all" hook
[00:00:00]             └-> "before all" hook
[00:00:00]               │ debg creating role ft_ml_source
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_ml_source]
[00:00:00]               │ debg creating role ft_ml_source_readonly
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_ml_source_readonly]
[00:00:00]               │ debg creating role ft_ml_dest
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_ml_dest]
[00:00:00]               │ debg creating role ft_ml_dest_readonly
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_ml_dest_readonly]
[00:00:00]               │ debg creating role ft_ml_ui_extras
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_ml_ui_extras]
[00:00:00]               │ debg creating role ft_default_space_ml_all
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_default_space_ml_all]
[00:00:00]               │ debg creating role ft_default_space1_ml_all
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_default_space1_ml_all]
[00:00:00]               │ debg creating role ft_all_spaces_ml_all
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_all_spaces_ml_all]
[00:00:00]               │ debg creating role ft_default_space_ml_read
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_default_space_ml_read]
[00:00:00]               │ debg creating role ft_default_space1_ml_read
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_default_space1_ml_read]
[00:00:00]               │ debg creating role ft_all_spaces_ml_read
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_all_spaces_ml_read]
[00:00:00]               │ debg creating role ft_default_space_ml_none
[00:00:00]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_default_space_ml_none]
[00:00:00]               │ debg creating user ft_ml_poweruser
[00:00:00]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_poweruser]
[00:00:00]               │ debg created user ft_ml_poweruser
[00:00:00]               │ debg creating user ft_ml_poweruser_spaces
[00:00:00]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_poweruser_spaces]
[00:00:00]               │ debg created user ft_ml_poweruser_spaces
[00:00:00]               │ debg creating user ft_ml_poweruser_space1
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_poweruser_space1]
[00:00:01]               │ debg created user ft_ml_poweruser_space1
[00:00:01]               │ debg creating user ft_ml_poweruser_all_spaces
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_poweruser_all_spaces]
[00:00:01]               │ debg created user ft_ml_poweruser_all_spaces
[00:00:01]               │ debg creating user ft_ml_viewer
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_viewer]
[00:00:01]               │ debg created user ft_ml_viewer
[00:00:01]               │ debg creating user ft_ml_viewer_spaces
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_viewer_spaces]
[00:00:01]               │ debg created user ft_ml_viewer_spaces
[00:00:01]               │ debg creating user ft_ml_viewer_space1
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_viewer_space1]
[00:00:01]               │ debg created user ft_ml_viewer_space1
[00:00:01]               │ debg creating user ft_ml_viewer_all_spaces
[00:00:01]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_viewer_all_spaces]
[00:00:01]               │ debg created user ft_ml_viewer_all_spaces
[00:00:01]               │ debg creating user ft_ml_unauthorized
[00:00:02]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_unauthorized]
[00:00:02]               │ debg created user ft_ml_unauthorized
[00:00:02]               │ debg creating user ft_ml_unauthorized_spaces
[00:00:02]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_unauthorized_spaces]
[00:00:02]               │ debg created user ft_ml_unauthorized_spaces
[00:43:03]             └-: data frame analytics
[00:43:03]               └-> "before all" hook
[00:45:53]               └-: jobs cloning supported by UI form
[00:45:53]                 └-> "before all" hook
[00:45:53]                 └-> "before all" hook
[00:45:53]                   │ debg applying update to kibana config: {"dateFormat:tz":"UTC"}
[00:45:54]                   │ debg SecurityPage.forceLogout
[00:45:54]                   │ debg Find.existsByDisplayedByCssSelector('.login-form') with timeout=100
[00:45:54]                   │ debg --- retry.tryForTime error: .login-form is not displayed
[00:45:54]                   │ debg Redirecting to /logout to force the logout
[00:45:54]                   │ debg Waiting on the login form to appear
[00:45:54]                   │ debg Waiting for Login Page to appear.
[00:45:54]                   │ debg Waiting up to 100000ms for login page...
[00:45:54]                   │ debg browser[INFO] http://localhost:6191/logout?_t=1613554740495 341 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:45:54]                   │
[00:45:54]                   │ debg browser[INFO] http://localhost:6191/bootstrap.js 42:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:45:54]                   │ debg Find.existsByDisplayedByCssSelector('.login-form') with timeout=2500
[00:45:57]                   │ERROR browser[SEVERE] http://localhost:6191/internal/security/me - Failed to load resource: the server responded with a status of 401 (Unauthorized)
[00:45:57]                   │ debg browser[INFO] http://localhost:6191/login?msg=LOGGED_OUT 341 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:45:57]                   │
[00:45:57]                   │ debg browser[INFO] http://localhost:6191/bootstrap.js 42:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:45:57]                   │ debg --- retry.tryForTime error: .login-form is not displayed
[00:45:58]                   │ERROR browser[SEVERE] http://localhost:6191/internal/spaces/_active_space - Failed to load resource: the server responded with a status of 401 (Unauthorized)
[00:45:58]                   │ERROR browser[SEVERE] http://localhost:6191/internal/security/me - Failed to load resource: the server responded with a status of 401 (Unauthorized)
[00:45:58]                   │ debg browser[INFO] http://localhost:6191/39917/bundles/core/core.entry.js 12:159352 "Detected an unhandled Promise rejection.
[00:45:58]                   │      Error: Unauthorized"
[00:45:58]                   │ERROR browser[SEVERE] http://localhost:6191/39917/bundles/core/core.entry.js 5:3002 
[00:45:58]                   │ERROR browser[SEVERE] http://localhost:6191/api/licensing/info - Failed to load resource: the server responded with a status of 401 (Unauthorized)
[00:45:58]                   │ debg Find.existsByDisplayedByCssSelector('.login-form') with timeout=2500
[00:45:58]                   │ debg TestSubjects.exists(loginForm)
[00:45:58]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="loginForm"]') with timeout=2500
[00:45:58]                   │ debg Waiting for Login Form to appear.
[00:45:58]                   │ debg Waiting up to 100000ms for login form...
[00:45:58]                   │ debg TestSubjects.exists(loginForm)
[00:45:58]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="loginForm"]') with timeout=2500
[00:45:58]                   │ debg TestSubjects.setValue(loginUsername, ft_ml_poweruser)
[00:45:58]                   │ debg TestSubjects.click(loginUsername)
[00:45:58]                   │ debg Find.clickByCssSelector('[data-test-subj="loginUsername"]') with timeout=10000
[00:45:58]                   │ debg Find.findByCssSelector('[data-test-subj="loginUsername"]') with timeout=10000
[00:45:58]                   │ debg TestSubjects.setValue(loginPassword, mlp001)
[00:45:58]                   │ debg TestSubjects.click(loginPassword)
[00:45:58]                   │ debg Find.clickByCssSelector('[data-test-subj="loginPassword"]') with timeout=10000
[00:45:58]                   │ debg Find.findByCssSelector('[data-test-subj="loginPassword"]') with timeout=10000
[00:45:58]                   │ debg TestSubjects.click(loginSubmit)
[00:45:58]                   │ debg Find.clickByCssSelector('[data-test-subj="loginSubmit"]') with timeout=10000
[00:45:58]                   │ debg Find.findByCssSelector('[data-test-subj="loginSubmit"]') with timeout=10000
[00:45:58]                   │ proc [kibana]   log   [09:39:04.682] [info][plugins][routes][security] Logging in with provider "basic" (basic)
[00:45:58]                   │ debg Waiting for login result, expected: chrome.
[00:45:58]                   │ debg Find.findByCssSelector('[data-test-subj="kibanaChrome"] .app-wrapper:not(.hidden-chrome)') with timeout=20000
[00:46:00]                   │ debg browser[INFO] http://localhost:6191/app/home 341 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:46:00]                   │
[00:46:00]                   │ debg browser[INFO] http://localhost:6191/bootstrap.js 42:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:46:00]                   │ debg Finished login process currentUrl = http://localhost:6191/app/home#/
[00:46:00]                   │ debg Waiting up to 20000ms for logout button visible...
[00:46:00]                   │ debg TestSubjects.exists(userMenuButton)
[00:46:00]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="userMenuButton"]') with timeout=2500
[00:46:00]                   │ debg TestSubjects.exists(userMenu)
[00:46:00]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="userMenu"]') with timeout=2500
[00:46:03]                   │ debg --- retry.tryForTime error: [data-test-subj="userMenu"] is not displayed
[00:46:03]                   │ debg TestSubjects.click(userMenuButton)
[00:46:03]                   │ debg Find.clickByCssSelector('[data-test-subj="userMenuButton"]') with timeout=10000
[00:46:03]                   │ debg Find.findByCssSelector('[data-test-subj="userMenuButton"]') with timeout=10000
[00:46:03]                   │ debg TestSubjects.exists(userMenu)
[00:46:03]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="userMenu"]') with timeout=120000
[00:46:03]                   │ debg TestSubjects.exists(userMenu > logoutLink)
[00:46:03]                   │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="userMenu"] [data-test-subj="logoutLink"]') with timeout=2500
[00:46:03]                 └-: classification job supported by the form
[00:46:03]                   └-> "before all" hook
[00:46:03]                   └-> "before all" hook
[00:46:03]                     │ info [ml/bm_classification] Loading "mappings.json"
[00:46:03]                     │ info [ml/bm_classification] Loading "data.json.gz"
[00:46:03]                     │ info [ml/bm_classification] Skipped restore for existing index "ft_bank_marketing"
[00:46:04]                     │ debg Searching for 'index-pattern' with title 'ft_bank_marketing'...
[00:46:04]                     │ debg  > Found 'ce9abca0-7103-11eb-86ab-a9e96abdbe75'
[00:46:04]                     │ debg Index pattern with title 'ft_bank_marketing' already exists. Nothing to create.
[00:46:04]                     │ debg Creating data frame analytic job with id 'bm_1_1613551984853' ...
[00:46:04]                     │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.ml-config] creating index, cause [auto(bulk api)], templates [], shards [1]/[1]
[00:46:04]                     │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] updating number_of_replicas to [0] for indices [.ml-config]
[00:46:04]                     │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.ml-annotations-6] creating index, cause [api], templates [], shards [1]/[1]
[00:46:04]                     │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] updating number_of_replicas to [0] for indices [.ml-annotations-6]
[00:46:04]                     │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.ml-notifications-000001] creating index, cause [auto(bulk api)], templates [.ml-notifications-000001], shards [1]/[1]
[00:46:04]                     │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] updating number_of_replicas to [0] for indices [.ml-notifications-000001]
[00:46:05]                     │ debg Waiting up to 5000ms for 'bm_1_1613551984853' to exist...
[00:46:05]                     │ debg Fetching data frame analytics job 'bm_1_1613551984853'...
[00:46:05]                     │ debg > DFA job fetched.
[00:46:05]                     │ debg > DFA job created.
[00:46:05]                     │ debg navigating to ml url: http://localhost:6191/app/ml
[00:46:05]                     │ debg navigate to: http://localhost:6191/app/ml
[00:46:05]                     │ debg browser[INFO] http://localhost:6191/app/ml?_t=1613554750887 341 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:46:05]                     │
[00:46:05]                     │ debg browser[INFO] http://localhost:6191/bootstrap.js 42:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:46:05]                     │ debg ... sleep(700) start
[00:46:05]                     │ debg ... sleep(700) end
[00:46:05]                     │ debg returned from get, calling refresh
[00:46:06]                     │ERROR browser[SEVERE] http://localhost:6191/39917/bundles/core/core.entry.js 12:158404 TypeError: Failed to fetch
[00:46:06]                     │          at fetch_Fetch.fetchResponse (http://localhost:6191/39917/bundles/core/core.entry.js:6:32451)
[00:46:06]                     │          at async interceptResponse (http://localhost:6191/39917/bundles/core/core.entry.js:6:28637)
[00:46:06]                     │          at async http://localhost:6191/39917/bundles/core/core.entry.js:6:31117
[00:46:06]                     │ debg browser[INFO] http://localhost:6191/app/ml?_t=1613554750887 341 Refused to execute inline script because it violates the following Content Security Policy directive: "script-src 'unsafe-eval' 'self'". Either the 'unsafe-inline' keyword, a hash ('sha256-P5polb1UreUSOe5V/Pv7tc+yeZuJXiOi/3fqhGsU7BE='), or a nonce ('nonce-...') is required to enable inline execution.
[00:46:06]                     │
[00:46:06]                     │ debg browser[INFO] http://localhost:6191/bootstrap.js 42:19 "^ A single error about an inline script not firing due to content security policy is expected!"
[00:46:07]                     │ debg currentUrl = http://localhost:6191/app/ml
[00:46:07]                     │          appUrl = http://localhost:6191/app/ml
[00:46:07]                     │ debg TestSubjects.find(kibanaChrome)
[00:46:07]                     │ debg Find.findByCssSelector('[data-test-subj="kibanaChrome"]') with timeout=60000
[00:46:07]                     │ debg ... sleep(501) start
[00:46:07]                     │ debg ... sleep(501) end
[00:46:07]                     │ debg in navigateTo url = http://localhost:6191/app/ml/overview
[00:46:07]                     │ debg --- retry.try error: URL changed, waiting for it to settle
[00:46:08]                     │ debg ... sleep(501) start
[00:46:08]                     │ debg ... sleep(501) end
[00:46:08]                     │ debg in navigateTo url = http://localhost:6191/app/ml/overview
[00:46:08]                     │ debg TestSubjects.exists(statusPageContainer)
[00:46:08]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="statusPageContainer"]') with timeout=2500
[00:46:11]                     │ debg --- retry.tryForTime error: [data-test-subj="statusPageContainer"] is not displayed
[00:46:11]                     │ debg TestSubjects.exists(mlApp)
[00:46:11]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlApp"]') with timeout=2000
[00:46:11]                     │ debg TestSubjects.click(~mlMainTab & ~dataFrameAnalytics)
[00:46:11]                     │ debg Find.clickByCssSelector('[data-test-subj~="mlMainTab"][data-test-subj~="dataFrameAnalytics"]') with timeout=10000
[00:46:11]                     │ debg Find.findByCssSelector('[data-test-subj~="mlMainTab"][data-test-subj~="dataFrameAnalytics"]') with timeout=10000
[00:46:12]                     │ debg TestSubjects.exists(~mlMainTab & ~dataFrameAnalytics & ~selected)
[00:46:12]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj~="mlMainTab"][data-test-subj~="dataFrameAnalytics"][data-test-subj~="selected"]') with timeout=120000
[00:46:12]                     │ debg TestSubjects.exists(mlPageDataFrameAnalytics)
[00:46:12]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlPageDataFrameAnalytics"]') with timeout=120000
[00:46:12]                     │ debg TestSubjects.exists(~mlAnalyticsTable)
[00:46:12]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj~="mlAnalyticsTable"]') with timeout=60000
[00:46:12]                     │ debg TestSubjects.exists(mlAnalyticsTable loaded)
[00:46:12]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlAnalyticsTable loaded"]') with timeout=30000
[00:46:12]                     │ debg TestSubjects.exists(~mlAnalyticsTable)
[00:46:12]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj~="mlAnalyticsTable"]') with timeout=60000
[00:46:12]                     │ debg TestSubjects.exists(mlAnalyticsTable loaded)
[00:46:12]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlAnalyticsTable loaded"]') with timeout=30000
[00:46:12]                     │ debg TestSubjects.find(mlAnalyticsTableContainer)
[00:46:12]                     │ debg Find.findByCssSelector('[data-test-subj="mlAnalyticsTableContainer"]') with timeout=10000
[00:46:12]                     │ debg TestSubjects.find(mlAnalyticsTableContainer)
[00:46:12]                     │ debg Find.findByCssSelector('[data-test-subj="mlAnalyticsTableContainer"]') with timeout=10000
[00:46:12]                     │ debg TestSubjects.find(~mlAnalyticsTable)
[00:46:12]                     │ debg Find.findByCssSelector('[data-test-subj~="mlAnalyticsTable"]') with timeout=10000
[00:46:12]                     │ debg TestSubjects.exists(mlAnalyticsJobDeleteButton)
[00:46:12]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlAnalyticsJobDeleteButton"]') with timeout=2500
[00:46:15]                     │ debg --- retry.tryForTime error: [data-test-subj="mlAnalyticsJobDeleteButton"] is not displayed
[00:46:15]                     │ debg TestSubjects.click(~mlAnalyticsTable > ~row-bm_1_1613551984853 > euiCollapsedItemActionsButton)
[00:46:15]                     │ debg Find.clickByCssSelector('[data-test-subj~="mlAnalyticsTable"] [data-test-subj~="row-bm_1_1613551984853"] [data-test-subj="euiCollapsedItemActionsButton"]') with timeout=10000
[00:46:15]                     │ debg Find.findByCssSelector('[data-test-subj~="mlAnalyticsTable"] [data-test-subj~="row-bm_1_1613551984853"] [data-test-subj="euiCollapsedItemActionsButton"]') with timeout=10000
[00:46:16]                     │ debg TestSubjects.exists(mlAnalyticsJobDeleteButton)
[00:46:16]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlAnalyticsJobDeleteButton"]') with timeout=5000
[00:46:16]                     │ debg TestSubjects.click(mlAnalyticsJobCloneButton)
[00:46:16]                     │ debg Find.clickByCssSelector('[data-test-subj="mlAnalyticsJobCloneButton"]') with timeout=10000
[00:46:16]                     │ debg Find.findByCssSelector('[data-test-subj="mlAnalyticsJobCloneButton"]') with timeout=10000
[00:46:16]                     │ debg TestSubjects.exists(mlAnalyticsCreationContainer)
[00:46:16]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlAnalyticsCreationContainer"]') with timeout=120000
[00:46:16]                     │ debg browser[INFO] http://localhost:6191/39917/bundles/plugin/ml/ml.chunk.8.js 2:171914 "Property \"early_stopping_enabled\" is unknown."
[00:46:17]                   └-> opens the existing job in the data frame analytics job wizard
[00:46:17]                     └-> "before each" hook: global before each
[00:46:17]                     │ debg === TEST STEP === should open the wizard with a proper header
[00:46:17]                     │ debg TestSubjects.getVisibleText(mlDataFrameAnalyticsWizardHeaderTitle)
[00:46:17]                     │ debg TestSubjects.find(mlDataFrameAnalyticsWizardHeaderTitle)
[00:46:17]                     │ debg Find.findByCssSelector('[data-test-subj="mlDataFrameAnalyticsWizardHeaderTitle"]') with timeout=10000
[00:46:17]                     │ debg TestSubjects.exists(mlAnalyticsCreateJobWizardConfigurationStep active)
[00:46:17]                     │ debg Find.existsByDisplayedByCssSelector('[data-test-subj="mlAnalyticsCreateJobWizardConfigurationStep active"]') with timeout=120000
[00:46:19]                     │ debg --- retry.tryForTime error: [data-test-subj="mlAnalyticsCreateJobWizardConfigurationStep active"] is not displayed
[00:46:22]                     │ debg --- retry.tryForTime failed again with the same message... (repeated every ~3 s until [00:48:17])
[00:48:18]                     │ info Taking screenshot "/dev/shm/workspace/parallel/9/kibana/x-pack/test/functional/screenshots/failure/machine learning  data frame analytics jobs cloning supported by UI form classification job supported by the form opens the existing job in the data frame analytics job wizard.png"
[00:48:18]                     │ info Current URL is: http://localhost:6191/app/ml/data_frame_analytics/new_job?index=ce9abca0-7103-11eb-86ab-a9e96abdbe75&jobId=bm_1_1613551984853
[00:48:18]                     │ info Saving page source to: /dev/shm/workspace/parallel/9/kibana/x-pack/test/functional/failure_debug/html/machine learning  data frame analytics jobs cloning supported by UI form classification job supported by the form opens the existing job in the data frame analytics job wizard.html
[00:48:18]                     └- ✖ fail: machine learning  data frame analytics jobs cloning supported by UI form classification job supported by the form opens the existing job in the data frame analytics job wizard
[00:48:18]                     │      Error: expected testSubject(mlAnalyticsCreateJobWizardConfigurationStep active) to exist
[00:48:18]                     │       at TestSubjects.existOrFail (/dev/shm/workspace/parallel/9/kibana/test/functional/services/common/test_subjects.ts:51:15)
[00:48:18]                     │       at Object.assertConfigurationStepActive (test/functional/services/ml/data_frame_analytics_creation.ts:316:7)
[00:48:18]                     │       at Context.<anonymous> (test/functional/apps/ml/data_frame_analytics/cloning.ts:166:11)
[00:48:18]                     │       at Object.apply (/dev/shm/workspace/parallel/9/kibana/packages/kbn-test/src/functional_test_runner/lib/mocha/wrap_function.js:73:16)
[00:48:18]                     │ 
[00:48:18]                     │ 

Stack Trace

Error: expected testSubject(mlAnalyticsCreateJobWizardConfigurationStep active) to exist
    at TestSubjects.existOrFail (/dev/shm/workspace/parallel/9/kibana/test/functional/services/common/test_subjects.ts:51:15)
    at Object.assertConfigurationStepActive (test/functional/services/ml/data_frame_analytics_creation.ts:316:7)
    at Context.<anonymous> (test/functional/apps/ml/data_frame_analytics/cloning.ts:166:11)
    at Object.apply (/dev/shm/workspace/parallel/9/kibana/packages/kbn-test/src/functional_test_runner/lib/mocha/wrap_function.js:73:16)

X-Pack EPM API Integration Tests.x-pack/test/fleet_api_integration/apis/fleet_setup·ts.Fleet Endpoints fleet_setup should not create a fleet_enroll role if one does not already exist

Link to Jenkins

Standard Out

Failed Tests Reporter:
  - Test has not failed recently on tracked branches

[00:00:00]       │
[00:00:00]         └-: Fleet Endpoints
[00:00:00]           └-> "before all" hook
[00:00:00]           └-: fleet_setup
[00:00:00]             └-> "before all" hook
[00:00:00]             └-> should not create a fleet_enroll role if one does not already exist
[00:00:00]               └-> "before each" hook: global before each
[00:00:00]               └-> "before each" hook: beforeSetupWithDockerRegistry
[00:00:00]               └-> "before each" hook
[00:00:00]               │ proc [kibana]   log   [09:40:19.202] [info][fleet][plugins] Custom registry url is an experimental feature and is unsupported.
[00:00:00]               │ proc [kibana]   log   [09:40:19.205] [info][fleet][plugins] Custom registry url is an experimental feature and is unsupported.
[00:00:00]               │ info [docker:registry] 2021/02/17 09:40:19 source.ip: 172.17.0.1:35770, url.original: /search?package=system&internal=true&experimental=true
[00:00:00]               │ info [docker:registry] 2021/02/17 09:40:19 source.ip: 172.17.0.1:35766, url.original: /search?package=endpoint&internal=true&experimental=true
[00:00:00]               │ proc [kibana]   log   [09:40:19.236] [info][fleet][plugins] Custom registry url is an experimental feature and is unsupported.
[00:00:00]               │ proc [kibana]   log   [09:40:19.241] [info][fleet][plugins] Custom registry url is an experimental feature and is unsupported.
[00:00:00]               │ info [docker:registry] 2021/02/17 09:40:19 source.ip: 172.17.0.1:35782, url.original: /search?package=system&internal=true&experimental=true
[00:00:00]               │ info [docker:registry] 2021/02/17 09:40:19 source.ip: 172.17.0.1:35780, url.original: /search?package=endpoint&internal=true&experimental=true
[00:00:00]               │ proc [kibana]   log   [09:40:19.246] [info][fleet][plugins] Custom registry url is an experimental feature and is unsupported.
[00:00:00]               │ proc [kibana]   log   [09:40:19.248] [info][fleet][plugins] Custom registry url is an experimental feature and is unsupported.
[00:00:00]               │ info [docker:registry] 2021/02/17 09:40:19 source.ip: 172.17.0.1:35790, url.original: /package/system/0.5.3
[00:00:00]               │ info [docker:registry] 2021/02/17 09:40:19 source.ip: 172.17.0.1:35788, url.original: /package/endpoint/0.13.1
[00:00:00]               │ info [docker:registry] 2021/02/17 09:40:19 source.ip: 172.17.0.1:35796, url.original: /package/endpoint/0.13.1/
[00:00:00]               │ info [docker:registry] 2021/02/17 09:40:19 source.ip: 172.17.0.1:35798, url.original: /package/system/0.5.3/
[00:00:00]               │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.kibana_1/SDEq2v4OSk-F-In267N2ng] update_mapping [_doc]
[00:00:00]               │ proc [kibana]   log   [09:40:19.370] [info][fleet][plugins] Custom registry url is an experimental feature and is unsupported.
[00:00:00]               │ info [docker:registry] 2021/02/17 09:40:19 source.ip: 172.17.0.1:35810, url.original: /epr/system/system-0.5.3.zip
[00:00:00]               │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.kibana_1/SDEq2v4OSk-F-In267N2ng] update_mapping [_doc]
[00:00:00]               │ proc [kibana]   log   [09:40:19.409] [info][fleet][plugins] Custom registry url is an experimental feature and is unsupported.
[00:00:00]               │ info [docker:registry] 2021/02/17 09:40:19 source.ip: 172.17.0.1:35816, url.original: /epr/endpoint/endpoint-0.13.1.zip
[00:00:00]               │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.ds-ilm-history-5-2021.02.17-000001] creating index, cause [initialize_data_stream], templates [ilm-history], shards [1]/[0]
[00:00:00]               │ info [o.e.c.m.MetadataCreateDataStreamService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] adding data stream [ilm-history-5] with write index [.ds-ilm-history-5-2021.02.17-000001] and backing indices []
[00:00:00]               │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] moving index [.ds-ilm-history-5-2021.02.17-000001] from [null] to [{"phase":"new","action":"complete","name":"complete"}] in policy [ilm-history-ilm-policy]
[00:00:00]               │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[.ds-ilm-history-5-2021.02.17-000001][0]]])." previous.health="YELLOW" reason="shards started [[.ds-ilm-history-5-2021.02.17-000001][0]]"
[00:00:00]               │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] moving index [.ds-ilm-history-5-2021.02.17-000001] from [{"phase":"new","action":"complete","name":"complete"}] to [{"phase":"hot","action":"unfollow","name":"branch-check-unfollow-prerequisites"}] in policy [ilm-history-ilm-policy]
[00:00:00]               │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] moving index [.ds-ilm-history-5-2021.02.17-000001] from [{"phase":"hot","action":"unfollow","name":"branch-check-unfollow-prerequisites"}] to [{"phase":"hot","action":"rollover","name":"check-rollover-ready"}] in policy [ilm-history-ilm-policy]
[00:00:01]               │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.kibana_1/SDEq2v4OSk-F-In267N2ng] update_mapping [_doc]
[00:00:02]               │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.kibana_1/SDEq2v4OSk-F-In267N2ng] update_mapping [_doc]
[00:00:02]               │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.kibana_1/SDEq2v4OSk-F-In267N2ng] update_mapping [_doc]
[00:00:02]               │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.kibana_1/SDEq2v4OSk-F-In267N2ng] update_mapping [_doc]
[00:00:06]               │ info [o.e.c.m.MetadataIndexTemplateService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] adding component template [metrics-endpoint.policy-mappings]
[00:00:06]               │ info [o.e.c.m.MetadataIndexTemplateService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] adding component template [logs-endpoint.alerts-mappings]
[00:00:06]               │ info [o.e.c.m.MetadataIndexTemplateService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] adding component template [logs-endpoint.library-mappings]
[00:00:06]               │ info [o.e.c.m.MetadataIndexTemplateService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] adding component template [metrics-endpoint.metadata-mappings]
[00:00:06]               │ info [o.e.c.m.MetadataIndexTemplateService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] adding component template [logs-endpoint.security-mappings]
[00:00:06]               │ info [o.e.c.m.MetadataIndexTemplateService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] adding component template [logs-endpoint.network-mappings]
[00:00:06]               │ info [o.e.c.m.MetadataIndexTemplateService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] adding component template [logs-endpoint.registry-mappings]
[00:00:06]               │ info [o.e.c.m.MetadataIndexTemplateService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] adding component template [metrics-endpoint.metrics-mappings]
[00:00:06]               │ info [o.e.c.m.MetadataIndexTemplateService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] adding component template [logs-endpoint.file-mappings]
[00:00:06]               │ info [o.e.c.m.MetadataIndexTemplateService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] adding component template [logs-endpoint.process-mappings]
[00:00:06]               │ proc [kibana]   log   [09:40:25.600] [error][fleet][plugins] uninstalling system-0.5.3 after error installing
[00:00:06]               │ proc [kibana]   log   [09:40:25.610] [error][fleet][plugins] failed to uninstall or rollback package after installation error Error: system is installed by default and cannot be removed
[00:00:06]               │ proc [kibana]   log   [09:40:25.690] [error][fleet][plugins] uninstalling endpoint-0.13.1 after error installing
[00:00:06]               │ proc [kibana]   log   [09:40:25.699] [error][fleet][plugins] failed to uninstall or rollback package after installation error Error: endpoint is installed by default and cannot be removed
[00:00:06]               │ proc [kibana]   log   [09:40:25.701] [error][fleet][plugins] [illegal_argument_exception] composable template [metrics-system.diskio] template after composition is invalid response from /_index_template/metrics-system.diskio: {"error":{"root_cause":[{"type":"illegal_argument_exception","reason":"composable template [metrics-system.diskio] template after composition is invalid"}],"type":"illegal_argument_exception","reason":"composable template [metrics-system.diskio] template after composition is invalid","caused_by":{"type":"illegal_argument_exception","reason":"template [metrics-system.diskio] has alias and data stream definitions"}},"status":400}
[00:00:06]               └- ✖ fail: Fleet Endpoints fleet_setup should not create a fleet_enroll role if one does not already exist
[00:00:06]               │      Error: expected 200 "OK", got 400 "Bad Request"
[00:00:06]               │       at Test._assertStatus (/dev/shm/workspace/kibana/node_modules/supertest/lib/test.js:268:12)
[00:00:06]               │       at Test._assertFunction (/dev/shm/workspace/kibana/node_modules/supertest/lib/test.js:283:11)
[00:00:06]               │       at Test.assert (/dev/shm/workspace/kibana/node_modules/supertest/lib/test.js:173:18)
[00:00:06]               │       at assert (/dev/shm/workspace/kibana/node_modules/supertest/lib/test.js:131:12)
[00:00:06]               │       at /dev/shm/workspace/kibana/node_modules/supertest/lib/test.js:128:5
[00:00:06]               │       at Test.Request.callback (/dev/shm/workspace/kibana/node_modules/supertest/node_modules/superagent/lib/node/index.js:718:3)
[00:00:06]               │       at /dev/shm/workspace/kibana/node_modules/supertest/node_modules/superagent/lib/node/index.js:906:18
[00:00:06]               │       at IncomingMessage.<anonymous> (/dev/shm/workspace/kibana/node_modules/supertest/node_modules/superagent/lib/node/parsers/json.js:19:7)
[00:00:06]               │       at endReadableNT (internal/streams/readable.js:1327:12)
[00:00:06]               │       at processTicksAndRejections (internal/process/task_queues.js:80:21)
[00:00:06]               │ 
[00:00:06]               │ 

Stack Trace

Error: expected 200 "OK", got 400 "Bad Request"
    at Test._assertStatus (/dev/shm/workspace/kibana/node_modules/supertest/lib/test.js:268:12)
    at Test._assertFunction (/dev/shm/workspace/kibana/node_modules/supertest/lib/test.js:283:11)
    at Test.assert (/dev/shm/workspace/kibana/node_modules/supertest/lib/test.js:173:18)
    at assert (/dev/shm/workspace/kibana/node_modules/supertest/lib/test.js:131:12)
    at /dev/shm/workspace/kibana/node_modules/supertest/lib/test.js:128:5
    at Test.Request.callback (/dev/shm/workspace/kibana/node_modules/supertest/node_modules/superagent/lib/node/index.js:718:3)
    at /dev/shm/workspace/kibana/node_modules/supertest/node_modules/superagent/lib/node/index.js:906:18
    at IncomingMessage.<anonymous> (/dev/shm/workspace/kibana/node_modules/supertest/node_modules/superagent/lib/node/parsers/json.js:19:7)
    at endReadableNT (internal/streams/readable.js:1327:12)
    at processTicksAndRejections (internal/process/task_queues.js:80:21)

X-Pack API Integration Tests.x-pack/test/api_integration/apis/ml/modules/setup_module·ts.apis Machine Learning modules module setup sets up module data for logs_ui_categories with prefix, startDatafeed true and estimateModelMemory true

Link to Jenkins

Standard Out

Failed Tests Reporter:
  - Test has failed 1 times on tracked branches: https://dryrun

[00:00:00]       │
[00:00:00]         └-: apis
[00:00:00]           └-> "before all" hook
[00:07:50]           └-: Machine Learning
[00:07:50]             └-> "before all" hook
[00:07:50]             └-> "before all" hook
[00:07:50]               │ debg creating role ft_ml_source
[00:07:50]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_ml_source]
[00:07:50]               │ debg creating role ft_ml_source_readonly
[00:07:50]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_ml_source_readonly]
[00:07:50]               │ debg creating role ft_ml_dest
[00:07:50]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_ml_dest]
[00:07:50]               │ debg creating role ft_ml_dest_readonly
[00:07:50]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_ml_dest_readonly]
[00:07:50]               │ debg creating role ft_ml_ui_extras
[00:07:50]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_ml_ui_extras]
[00:07:50]               │ debg creating role ft_default_space_ml_all
[00:07:50]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_default_space_ml_all]
[00:07:50]               │ debg creating role ft_default_space1_ml_all
[00:07:50]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_default_space1_ml_all]
[00:07:50]               │ debg creating role ft_all_spaces_ml_all
[00:07:50]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_all_spaces_ml_all]
[00:07:50]               │ debg creating role ft_default_space_ml_read
[00:07:50]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_default_space_ml_read]
[00:07:50]               │ debg creating role ft_default_space1_ml_read
[00:07:50]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_default_space1_ml_read]
[00:07:50]               │ debg creating role ft_all_spaces_ml_read
[00:07:50]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_all_spaces_ml_read]
[00:07:50]               │ debg creating role ft_default_space_ml_none
[00:07:50]               │ info [o.e.x.s.a.r.TransportPutRoleAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added role [ft_default_space_ml_none]
[00:07:50]               │ debg creating user ft_ml_poweruser
[00:07:50]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_poweruser]
[00:07:50]               │ debg created user ft_ml_poweruser
[00:07:50]               │ debg creating user ft_ml_poweruser_spaces
[00:07:50]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_poweruser_spaces]
[00:07:50]               │ debg created user ft_ml_poweruser_spaces
[00:07:50]               │ debg creating user ft_ml_poweruser_space1
[00:07:50]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_poweruser_space1]
[00:07:50]               │ debg created user ft_ml_poweruser_space1
[00:07:50]               │ debg creating user ft_ml_poweruser_all_spaces
[00:07:50]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_poweruser_all_spaces]
[00:07:50]               │ debg created user ft_ml_poweruser_all_spaces
[00:07:50]               │ debg creating user ft_ml_viewer
[00:07:50]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_viewer]
[00:07:50]               │ debg created user ft_ml_viewer
[00:07:50]               │ debg creating user ft_ml_viewer_spaces
[00:07:51]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_viewer_spaces]
[00:07:51]               │ debg created user ft_ml_viewer_spaces
[00:07:51]               │ debg creating user ft_ml_viewer_space1
[00:07:51]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_viewer_space1]
[00:07:51]               │ debg created user ft_ml_viewer_space1
[00:07:51]               │ debg creating user ft_ml_viewer_all_spaces
[00:07:51]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_viewer_all_spaces]
[00:07:51]               │ debg created user ft_ml_viewer_all_spaces
[00:07:51]               │ debg creating user ft_ml_unauthorized
[00:07:51]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_unauthorized]
[00:07:51]               │ debg created user ft_ml_unauthorized
[00:07:51]               │ debg creating user ft_ml_unauthorized_spaces
[00:07:51]               │ info [o.e.x.s.a.u.TransportPutUserAction] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] added user [ft_ml_unauthorized_spaces]
[00:07:51]               │ debg created user ft_ml_unauthorized_spaces
[00:07:51]             └-: modules
[00:07:51]               └-> "before all" hook
[00:08:06]               └-: module setup
[00:08:06]                 └-> "before all" hook
[00:08:06]                 └-> "before all" hook
[00:08:06]                   │ debg applying update to kibana config: {"dateFormat:tz":"UTC"}
[00:08:49]                 └-: sets up module data
[00:08:49]                   └-> "before all" hook
[00:08:49]                   └-> "before all" hook
[00:08:49]                     │ info [ml/module_logs] Loading "mappings.json"
[00:08:49]                     │ info [ml/module_logs] Loading "data.json.gz"
[00:08:49]                     │ info [ml/module_logs] Skipped restore for existing index "ft_module_logs"
[00:08:49]                     │ debg Searching for 'index-pattern' with title 'ft_module_logs'...
[00:08:49]                     │ debg  > Found 'eac98cd0-7103-11eb-8a66-79625f22bf2d'
[00:08:49]                     │ debg Index pattern with title 'ft_module_logs' already exists. Nothing to create.
[00:08:49]                   └-> for logs_ui_categories with prefix, startDatafeed true and estimateModelMemory true
[00:08:49]                     └-> "before each" hook: global before each
[00:08:49]                     │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.ml-anomalies-shared] creating index, cause [api], templates [.ml-anomalies-], shards [1]/[1]
[00:08:49]                     │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] updating number_of_replicas to [0] for indices [.ml-anomalies-shared]
[00:08:49]                     │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.ml-annotations-6] creating index, cause [api], templates [], shards [1]/[1]
[00:08:49]                     │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] updating number_of_replicas to [0] for indices [.ml-annotations-6]
[00:08:50]                     │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.ml-anomalies-shared/O3BCtFeVRFidRVHUOHKCaQ] update_mapping [_doc]
[00:08:50]                     │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.ml-config] creating index, cause [auto(bulk api)], templates [], shards [1]/[1]
[00:08:50]                     │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] updating number_of_replicas to [0] for indices [.ml-config]
[00:08:50]                     │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.ml-notifications-000001] creating index, cause [auto(bulk api)], templates [.ml-notifications-000001], shards [1]/[1]
[00:08:50]                     │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] updating number_of_replicas to [0] for indices [.ml-notifications-000001]
[00:08:52]                     │ info [o.e.x.m.j.p.a.AutodetectProcessManager] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] Opening job [pf7_log-entry-categories-count]
[00:08:52]                     │ info [o.e.x.c.m.u.MlIndexAndAlias] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] About to create first concrete index [.ml-state-000001] with alias [.ml-state-write]
[00:08:52]                     │ info [o.e.c.m.MetadataCreateIndexService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.ml-state-000001] creating index, cause [api], templates [.ml-state], shards [1]/[1]
[00:08:52]                     │ info [o.e.c.r.a.AllocationService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] updating number_of_replicas to [0] for indices [.ml-state-000001]
[00:08:52]                     │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] moving index [.ml-state-000001] from [null] to [{"phase":"new","action":"complete","name":"complete"}] in policy [ml-size-based-ilm-policy]
[00:08:52]                     │ info [o.e.x.m.j.p.a.AutodetectProcessManager] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [pf7_log-entry-categories-count] Loading model snapshot [N/A], job latest_record_timestamp [N/A]
[00:08:52]                     │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] moving index [.ml-state-000001] from [{"phase":"new","action":"complete","name":"complete"}] to [{"phase":"hot","action":"unfollow","name":"branch-check-unfollow-prerequisites"}] in policy [ml-size-based-ilm-policy]
[00:08:52]                     │ info [o.e.x.i.IndexLifecycleTransition] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] moving index [.ml-state-000001] from [{"phase":"hot","action":"unfollow","name":"branch-check-unfollow-prerequisites"}] to [{"phase":"hot","action":"rollover","name":"check-rollover-ready"}] in policy [ml-size-based-ilm-policy]
[00:08:52]                     │ info [o.e.x.m.p.l.CppLogMessageHandler] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [pf7_log-entry-categories-count] [autodetect/253094] [CResourceMonitor.cc@77] Setting model memory limit to 41 MB
[00:08:52]                     │ info [o.e.x.m.j.p.a.AutodetectProcessManager] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] Successfully set job state to [opened] for job [pf7_log-entry-categories-count]
[00:08:52]                     │ info [o.e.x.m.d.DatafeedJob] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [pf7_log-entry-categories-count] Datafeed started (from: 1970-01-01T00:00:00.000Z to: 2021-02-17T09:29:58.998Z) with frequency [450000ms]
[00:08:52]                     │ debg Waiting up to 5000ms for 'pf7_log-entry-categories-count' to exist...
[00:08:52]                     │ debg Waiting up to 5000ms for 'datafeed-pf7_log-entry-categories-count' to exist...
[00:08:52]                     │ debg Waiting up to 10000ms for 'pf7_log-entry-categories-count' to have processed_record_count > 0...
[00:08:52]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:08:52]                     │ debg > AD job stats fetched.
[00:08:52]                     │ debg --- retry.waitForWithTimeout error: expected anomaly detection job 'pf7_log-entry-categories-count' to have processed_record_count > 0 (got 0)
[00:08:52]                     │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.ml-anomalies-shared/O3BCtFeVRFidRVHUOHKCaQ] update_mapping [_doc]
[00:08:53]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:08:53]                     │ debg > AD job stats fetched.
[00:08:53]                     │ debg Waiting up to 120000ms for job state to be closed...
[00:08:53]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:08:53]                     │ debg > AD job stats fetched.
[00:08:53]                     │ debg --- retry.waitForWithTimeout error: expected job state to be closed but got opened
[00:08:53]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:08:53]                     │ debg > AD job stats fetched.
[00:08:53]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:08:54]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:08:54]                     │ debg > AD job stats fetched.
[00:08:54]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:08:54]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:08:54]                     │ debg > AD job stats fetched.
[00:08:54]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:08:55]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:08:55]                     │ debg > AD job stats fetched.
[00:08:55]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:08:55]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:08:55]                     │ debg > AD job stats fetched.
[00:08:55]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:08:56]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:08:56]                     │ debg > AD job stats fetched.
[00:08:56]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:08:56]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:08:56]                     │ debg > AD job stats fetched.
[00:08:56]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:08:57]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:08:57]                     │ debg > AD job stats fetched.
[00:08:57]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:08:57]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:08:57]                     │ debg > AD job stats fetched.
[00:08:57]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:08:58]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:08:58]                     │ debg > AD job stats fetched.
[00:08:58]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:08:58]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:08:58]                     │ debg > AD job stats fetched.
[00:08:58]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:08:59]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:08:59]                     │ debg > AD job stats fetched.
[00:08:59]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:08:59]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:08:59]                     │ debg > AD job stats fetched.
[00:08:59]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:00]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:00]                     │ debg > AD job stats fetched.
[00:09:00]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:00]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:00]                     │ debg > AD job stats fetched.
[00:09:00]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:01]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:01]                     │ debg > AD job stats fetched.
[00:09:01]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:01]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:01]                     │ debg > AD job stats fetched.
[00:09:01]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:02]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:02]                     │ debg > AD job stats fetched.
[00:09:02]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:02]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:02]                     │ debg > AD job stats fetched.
[00:09:02]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:03]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:03]                     │ debg > AD job stats fetched.
[00:09:03]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:03]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:03]                     │ debg > AD job stats fetched.
[00:09:03]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:04]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:04]                     │ debg > AD job stats fetched.
[00:09:04]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:04]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:04]                     │ debg > AD job stats fetched.
[00:09:04]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:05]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:05]                     │ debg > AD job stats fetched.
[00:09:05]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:06]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:06]                     │ debg > AD job stats fetched.
[00:09:06]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:06]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:06]                     │ debg > AD job stats fetched.
[00:09:06]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:07]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:07]                     │ debg > AD job stats fetched.
[00:09:07]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:07]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:07]                     │ debg > AD job stats fetched.
[00:09:07]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:08]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:08]                     │ debg > AD job stats fetched.
[00:09:08]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:08]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:08]                     │ debg > AD job stats fetched.
[00:09:08]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:09]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:09]                     │ debg > AD job stats fetched.
[00:09:09]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:09]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:09]                     │ debg > AD job stats fetched.
[00:09:09]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:10]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:10]                     │ debg > AD job stats fetched.
[00:09:10]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:10]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:10]                     │ debg > AD job stats fetched.
[00:09:10]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:10]                     │ info [o.e.x.m.d.DatafeedJob] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [pf7_log-entry-categories-count] Lookback has finished
[00:09:10]                     │ info [o.e.x.m.d.DatafeedManager] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [no_realtime] attempt to stop datafeed [datafeed-pf7_log-entry-categories-count] for job [pf7_log-entry-categories-count]
[00:09:10]                     │ info [o.e.x.m.d.DatafeedManager] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [no_realtime] try lock [20s] to stop datafeed [datafeed-pf7_log-entry-categories-count] for job [pf7_log-entry-categories-count]...
[00:09:10]                     │ info [o.e.x.m.d.DatafeedManager] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [no_realtime] stopping datafeed [datafeed-pf7_log-entry-categories-count] for job [pf7_log-entry-categories-count], acquired [true]...
[00:09:10]                     │ info [o.e.x.m.d.DatafeedManager] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [no_realtime] datafeed [datafeed-pf7_log-entry-categories-count] for job [pf7_log-entry-categories-count] has been stopped
[00:09:10]                     │ info [o.e.x.m.j.p.a.AutodetectProcessManager] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] Closing job [pf7_log-entry-categories-count], because [close job (api)]
[00:09:10]                     │ info [o.e.x.m.p.l.CppLogMessageHandler] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [pf7_log-entry-categories-count] [autodetect/253094] [CCmdSkeleton.cc@61] Handled 584 records
[00:09:10]                     │ info [o.e.x.m.p.l.CppLogMessageHandler] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [pf7_log-entry-categories-count] [autodetect/253094] [CAnomalyJob.cc@1569] Pruning all models
[00:09:10]                     │ info [o.e.c.m.MetadataMappingService] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [.ml-anomalies-shared/O3BCtFeVRFidRVHUOHKCaQ] update_mapping [_doc]
[00:09:11]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:11]                     │ debg > AD job stats fetched.
[00:09:11]                     │ debg --- retry.waitForWithTimeout error: expected job state to be closed but got closing
[00:09:11]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:11]                     │ debg > AD job stats fetched.
[00:09:11]                     │ debg --- retry.waitForWithTimeout failed again with the same message...
[00:09:11]                     │ info [o.e.x.m.p.AbstractNativeProcess] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [pf7_log-entry-categories-count] State output finished
[00:09:11]                     │ info [o.e.x.m.j.p.a.o.AutodetectResultProcessor] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [pf7_log-entry-categories-count] 2342 buckets parsed from autodetect output
[00:09:11]                     │ info [o.e.x.m.j.p.a.AutodetectCommunicator] [kibana-ci-immutable-ubuntu-18-tests-xxl-1613550038603918634] [pf7_log-entry-categories-count] job closed
[00:09:12]                     │ debg Fetching anomaly detection job stats for job pf7_log-entry-categories-count...
[00:09:12]                     │ debg > AD job stats fetched.
[00:09:12]                     │ debg Waiting up to 120000ms for datafeed state to be stopped...
[00:09:12]                     │ debg Fetching datafeed state for datafeed datafeed-pf7_log-entry-categories-count
[00:09:12]                     └- ✖ fail: apis Machine Learning modules module setup sets up module data for logs_ui_categories with prefix, startDatafeed true and estimateModelMemory true
[00:09:12]                     │       Error: Expected job model memory limits '[{"id":"pf7_log-entry-categories-count","modelMemoryLimit":"26mb"}]' (got '[{"id":"pf7_log-entry-categories-count","modelMemoryLimit":"41mb"}]')
[00:09:12]                     │       + expected - actual
[00:09:12]                     │ 
[00:09:12]                     │        [
[00:09:12]                     │          {
[00:09:12]                     │            "id": "pf7_log-entry-categories-count"
[00:09:12]                     │       -    "modelMemoryLimit": "41mb"
[00:09:12]                     │       +    "modelMemoryLimit": "26mb"
[00:09:12]                     │          }
[00:09:12]                     │        ]
[00:09:12]                     │       
[00:09:12]                     │       at Assertion.assert (/dev/shm/workspace/parallel/21/kibana/packages/kbn-expect/expect.js:100:11)
[00:09:12]                     │       at Assertion.eql (/dev/shm/workspace/parallel/21/kibana/packages/kbn-expect/expect.js:244:8)
[00:09:12]                     │       at Context.<anonymous> (test/api_integration/apis/ml/modules/setup_module.ts:868:46)
[00:09:12]                     │       at Object.apply (/dev/shm/workspace/parallel/21/kibana/packages/kbn-test/src/functional_test_runner/lib/mocha/wrap_function.js:73:16)
[00:09:12]                     │ 
[00:09:12]                     │ 
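
The repeated retry.waitForWithTimeout lines above come from the functional test runner polling a condition (job stats, datafeed state) until it holds or a timeout expires. A minimal sketch of that polling pattern — hypothetical names only, not the actual kbn-test implementation:

```ts
// Hypothetical polling helper illustrating the retry.waitForWithTimeout pattern in the log above.
// Names and signatures are assumptions for illustration, not the actual kbn-test API.
async function waitForWithTimeout(
  description: string,
  timeoutMs: number,
  condition: () => Promise<boolean>
): Promise<void> {
  const start = Date.now();
  while (Date.now() - start < timeoutMs) {
    if (await condition()) {
      return; // condition met, stop polling
    }
    // Each failed attempt is logged, producing the repeated
    // "failed again with the same message" entries seen above, then we back off briefly.
    await new Promise((resolve) => setTimeout(resolve, 500));
  }
  throw new Error(`timed out waiting for ${description} after ${timeoutMs}ms`);
}

// Usage mirroring the log: wait up to 120000ms for the anomaly detection job to close.
// fetchJobState is a hypothetical function returning the job state from the stats API:
// await waitForWithTimeout('job state to be closed', 120000, async () => (await fetchJobState()) === 'closed');
```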

Stack Trace

Error: Expected job model memory limits '[{"id":"pf7_log-entry-categories-count","modelMemoryLimit":"26mb"}]' (got '[{"id":"pf7_log-entry-categories-count","modelMemoryLimit":"41mb"}]')
    at Assertion.assert (/dev/shm/workspace/parallel/21/kibana/packages/kbn-expect/expect.js:100:11)
    at Assertion.eql (/dev/shm/workspace/parallel/21/kibana/packages/kbn-expect/expect.js:244:8)
    at Context.<anonymous> (test/api_integration/apis/ml/modules/setup_module.ts:868:46)
    at Object.apply (/dev/shm/workspace/parallel/21/kibana/packages/kbn-test/src/functional_test_runner/lib/mocha/wrap_function.js:73:16) {
  actual: '[\n' +
    '  {\n' +
    '    "id": "pf7_log-entry-categories-count"\n' +
    '    "modelMemoryLimit": "41mb"\n' +
    '  }\n' +
    ']',
  expected: '[\n' +
    '  {\n' +
    '    "id": "pf7_log-entry-categories-count"\n' +
    '    "modelMemoryLimit": "26mb"\n' +
    '  }\n' +
    ']',
  showDiff: true
}

and 3 more failures, only showing the first 3.
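
The failing check compares the model memory limit that ML estimated for the job (41mb) against the value the module setup test expects (26mb). A minimal sketch of such a comparison using the kbn-expect library referenced in the stack trace, with made-up variable names; the real test in setup_module.ts additionally builds the custom "Expected job model memory limits ..." message quoted above:

```ts
import expect from '@kbn/expect';

// Hypothetical reconstruction of the failing comparison; variable names are assumptions.
const expectedLimits = [{ id: 'pf7_log-entry-categories-count', modelMemoryLimit: '26mb' }];
const actualLimits = [{ id: 'pf7_log-entry-categories-count', modelMemoryLimit: '41mb' }]; // value returned in this CI run

// Deep-equality check on the serialized limits; it fails here because the
// estimated limit (41mb) does not match the expected 26mb.
expect(JSON.stringify(actualLimits)).to.eql(JSON.stringify(expectedLimits));
```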

Metrics [docs]

Module Count

Fewer modules lead to a faster build time.

id                         before    after     diff
indexLifecycleManagement   157       170       +13

Async chunks

Total size of all lazy-loaded chunks that will be downloaded as the user navigates the app.

id                         before    after     diff
indexLifecycleManagement   225.5KB   225.4KB   -17.0B

Page load bundle

Size of the bundles that are downloaded on every page load. Target size is below 100kb.

id                         before    after     diff
indexLifecycleManagement   64.2KB    63.9KB    -276.0B

History

To update your PR or re-run it, just comment with:
@elasticmachine merge upstream

@yuliacech yuliacech deleted the ilm_phase_blocks branch February 24, 2021 13:00
Labels
Feature:ILM, release_note:feature (Makes this part of the condensed release notes), Team:Kibana Management (Dev Tools, Index Management, Upgrade Assistant, ILM, Ingest Node Pipelines, and more), v7.12.0, v8.0.0