
Fix jenkins cobertura coverage #2346

Closed
paulmelnikow opened this issue Nov 17, 2018 · 9 comments
Labels: bug (Bugs in badges and the frontend), good first issue (New contributors, join in!), keep-service-tests-green (Related to fixing failing tests of the services), service-badge (Accepted and actionable changes, features, and bugs)

Comments

@paulmelnikow
Member

This may be a legitimate failure. If nothing else, it should prompt a change to the error handling.

  12) JenkinsCoverage
       cobertura: valid coverage
         
	[ GET /c/https/updates.jenkins-ci.org/job/hello-project/job/master.json ]:

      AssertionError: expected { Object (name, value) } to deeply equal { name: 'coverage', value: '64%' }
      + expected - actual

       {
         "name": "coverage"
      -  "value": "invalid response data"
      +  "value": "64%"
       }
      
      at Object.pathMatch.matchJSON (node_modules/icedfrisby/lib/pathMatch.js:138:38)
      at _expect (node_modules/icedfrisby/lib/icedfrisby.js:583:10)
      at IcedFrisbyNock._invokeExpects (node_modules/icedfrisby/lib/icedfrisby.js:1261:26)
      at start (node_modules/icedfrisby/lib/icedfrisby.js:1244:12)
      at Request.runCallback [as _callback] (node_modules/icedfrisby/lib/icedfrisby.js:1131:16)
      at Request.self.callback (node_modules/request/request.js:185:22)
      at Request.<anonymous> (node_modules/request/request.js:1161:10)
      at IncomingMessage.<anonymous> (node_modules/request/request.js:1083:12)
      at endReadableNT (_stream_readable.js:1064:12)
      at _combinedTickCallback (internal/process/next_tick.js:139:11)
      at process._tickDomainCallback (internal/process/next_tick.js:219:9)

  13) JenkinsCoverage
       cobertura: valid coverage (badge URL without leading /job after Jenkins host)
         
	[ GET /c/https/updates.jenkins-ci.org/hello-project/job/master.json ]:

      AssertionError: expected { Object (name, value) } to deeply equal { name: 'coverage', value: '64%' }
      + expected - actual

       {
         "name": "coverage"
      -  "value": "invalid response data"
      +  "value": "64%"
       }
      
      at Object.pathMatch.matchJSON (node_modules/icedfrisby/lib/pathMatch.js:138:38)
      at _expect (node_modules/icedfrisby/lib/icedfrisby.js:583:10)
      at IcedFrisbyNock._invokeExpects (node_modules/icedfrisby/lib/icedfrisby.js:1261:26)
      at start (node_modules/icedfrisby/lib/icedfrisby.js:1244:12)
      at Request.runCallback [as _callback] (node_modules/icedfrisby/lib/icedfrisby.js:1131:16)
      at Request.self.callback (node_modules/request/request.js:185:22)
      at Request.<anonymous> (node_modules/request/request.js:1161:10)
      at IncomingMessage.<anonymous> (node_modules/request/request.js:1083:12)
      at endReadableNT (_stream_readable.js:1064:12)
      at _combinedTickCallback (internal/process/next_tick.js:139:11)
      at process._tickDomainCallback (internal/process/next_tick.js:219:9)

Ref: #1359

@paulmelnikow paulmelnikow added bug Bugs in badges and the frontend good first issue New contributors, join in! service-badge Accepted and actionable changes, features, and bugs keep-service-tests-green Related to fixing failing tests of the services labels Nov 17, 2018
@calebcartwright
Member

One thing I've found is that if I remove the Conditionals coverage object from the elements array in the mocked reply object, these two tests start passing again.

I.e., the current mocked response fails:

.reply(200, {
  results: {
    elements: [
      {
        name: 'Conditionals',
        ratio: 95.146,
      },
      {
        name: 'Lines',
        ratio: 63.745,
      },
    ],
  },
})

but the same response object with just the one Lines object in elements will pass:

.reply(200, {
  results: {
    elements: [
      {
        name: 'Lines',
        ratio: 63.745,
      },
    ],
  },
})

@calebcartwright
Member

calebcartwright commented Dec 5, 2018

When I hit a live instance of the API with the same query params Shields uses, I get the following JSON response with various objects in the elements array:

{
  "_class": "hudson.plugins.cobertura.targets.CoverageResult",
  "results": {
    "elements": [
      {
        "name": "Packages",
        "ratio": 92.47312
      },
      {
        "name": "Files",
        "ratio": 90.55556
      },
      {
        "name": "Classes",
        "ratio": 90.94693
      },
      {
        "name": "Methods",
        "ratio": 77.76264
      },
      {
        "name": "Lines",
        "ratio": 74.3159
      },
      {
        "name": "Conditionals",
        "ratio": 65.21124
      }
    ]
  }
}

And I also see that the joi schema for the cobertura response here has name: 'Lines'. I don't have a ton of experience with joi, so should that line filter the objects in elements to exclude those with a name other than Lines? Or does that schema expect the name for each object to be Lines?

@paulmelnikow
Member Author

Ooh, thanks for looking into this!

I see. Yea, array.items() wants each item to match the schema.

I think the thing to do is change that name: 'Lines' to name: Joi.string().required(). We could chain on an array.has(schema) to make sure name: 'Lines' for one of the elements.

Then we can filter the results in the handler.
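A plain-JavaScript sketch of the two pieces suggested above (hypothetical function names, standing in for the actual joi schema and service handler): a check that only requires *some* element to be named Lines, plus handler-side filtering to pick that element out.

```javascript
// Hypothetical sketch, not the actual service code:
// 1) validate only that some element is named 'Lines' (roughly what the
//    suggested Joi.array().has(...) chain would assert), and
// 2) filter for that element in the handler.

function hasLinesElement(body) {
  return body.results.elements.some(el => el.name === 'Lines')
}

function lineCoverage(body) {
  const lines = body.results.elements.find(el => el.name === 'Lines')
  if (lines === undefined) {
    throw new Error('invalid response data')
  }
  return `${Math.round(lines.ratio)}%`
}

// Using the mocked reply from earlier in the thread:
const mocked = {
  results: {
    elements: [
      { name: 'Conditionals', ratio: 95.146 },
      { name: 'Lines', ratio: 63.745 },
    ],
  },
}

console.log(hasLinesElement(mocked)) // → true
console.log(lineCoverage(mocked)) // → '64%'
```

With this approach, extra elements such as Conditionals no longer cause a validation failure; they are simply ignored when the Lines ratio is extracted.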

@calebcartwright
Member

That's what I was thinking as well. I'll happily take this one and implement that change

@calebcartwright
Member

I didn't want to open a new issue for this (yet), but as an FYI:

The live test I added failed in the most recent daily build, but I'm positive this was simply due to the test taking longer than the mocha timeout period of 5 seconds. I've seen this a couple of times when running the test locally, where it occasionally takes 5-7 seconds.

I'll try to keep an eye on this test. If it keeps intermittently failing, I can try to find a different live Jenkins endpoint that will hopefully respond more quickly, unless there's a possibility of increasing the mocha timeout period for the service tests in the daily build. I think this particular test would be fine as is with a timeout period of 7500ms vs. the current 5000ms.

(5) JenkinsCoverage
       cobertura: job found
         
	[ GET /c/https/builds.apache.org/job/olingo-odata4-cobertura.json ]:
     Error: Request timed out after 5000 ms
      at Timeout.handleRequestTimeout [as _onTimeout] (node_modules/icedfrisby/lib/icedfrisby.js:1098:35)

@PyvesB
Member

PyvesB commented Dec 8, 2018

@calebcartwright you can increase the timeout on a per test basis. Simply add .timeout(10000) as we do in the F-Droid service for example.

@calebcartwright
Member

Excellent! Semi-random question for you all: do you care if I add a change like this to a different, but existing, PR I already have open (say #2477), or do you prefer separate PRs for different changes?

@PyvesB
Member

PyvesB commented Dec 8, 2018

I would recommend opening a new PR, as the currently open one is for an unrelated service. 😉

@calebcartwright
Member

Gotcha, thanks @PyvesB!
