
Add tests for python code in documentation #3394

Open · Sirusblk wants to merge 9 commits into main from Sirusblk:add-doc-tests
Conversation

@Sirusblk

Description

Add tests for Python code in documentation. This adds a new dev dependency, pytest-examples; with that package comes a single parameterized pytest in tests/docs/test_docs.py which automatically searches the docs folder for Python code examples in markdown files.
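For reference, the canonical usage from the pytest-examples README is a single parameterized test, so tests/docs/test_docs.py presumably follows roughly this shape (a sketch, not necessarily the PR's exact code):

```python
import pytest
from pytest_examples import CodeExample, EvalExample, find_examples

# Every python code block found under docs/ becomes its own test case.
@pytest.mark.parametrize("example", find_examples("docs"), ids=str)
def test_docs_examples(example: CodeExample, eval_example: EvalExample):
    eval_example.lint(example)             # style-check the snippet
    eval_example.run_print_check(example)  # run it and verify `#> ` output
```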

Types of Changes

  • Core
  • Bugfix
  • New feature
  • Enhancement/optimization
  • Documentation

Issues Fixed or Closed by This PR

Checklist

  • My code follows the code style of this project.
  • My change requires a change to the documentation.
  • I have updated the documentation accordingly.
  • I have read the CONTRIBUTING document.
  • I have added tests to cover my changes.
  • I have tested the changes and verified that they work and don't break anything (as well as I can manage).

@sourcery-ai bot (Contributor) left a comment

Failed to review pull request #3394 due to the following error:

<class 'KeyError'>: 0

Try re-running the review or contact support@sourcery.ai for help.

@botberry (Member) commented Feb 24, 2024

Thanks for adding the RELEASE.md file!

Here's a preview of the changelog:


Adds documentation tests for python code blocks in any markdown files in the docs folder. The tests are marked as flaky and allowed to fail. When testing the examples, imports for strawberry or the typing packages are automatically included.

Here's the tweet text:

🆕 Release (next) is out! Thanks to David McLaren for the PR 👏

Get it here 👉 https://strawberry.rocks/release/(next)
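The changelog's "automatically included" imports could be implemented several ways; purely as illustration, the simplest shape of the idea is to prepend a prelude before executing a snippet. A hypothetical sketch, not the PR's actual mechanism:

```python
# Hypothetical sketch: prepend common imports so snippets can omit them.
PRELUDE = "import typing\n\nimport strawberry\n\n"

def run_doc_snippet(source: str) -> dict:
    namespace: dict = {}
    exec(compile(PRELUDE + source, "<doc-example>", "exec"), namespace)
    return namespace
```

One wrinkle with prepending is that tracebacks point at shifted line numbers, which is a reason to prefer seeding the execution namespace instead (see the namespace discussion further down).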


@codecov bot commented Feb 24, 2024

Codecov Report

Merging #3394 (f6d8c71) into main (0c5bc4b) will decrease coverage by 2.10%.
The diff coverage is 83.33%.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #3394      +/-   ##
==========================================
- Coverage   96.47%   94.38%   -2.10%     
==========================================
  Files         509      506       -3     
  Lines       32691    31608    -1083     
  Branches     5399     3644    -1755     
==========================================
- Hits        31539    29833    -1706     
- Misses        941     1509     +568     
- Partials      211      266      +55     

@codspeed-hq bot commented Feb 24, 2024

CodSpeed Performance Report

Merging #3394 will not alter performance

Comparing Sirusblk:add-doc-tests (f6d8c71) with main (0c5bc4b)

Summary

✅ 12 untouched benchmarks

@DoctorJohn (Member) left a comment

Awesome proposal @Sirusblk! Your changes look good to me; I cross-checked them with the pytest-examples docs.

The CI failures seem to be present on the main branch as well. I'll approve this PR, but will wait for our CI to go green again before merging :)

@patrick91 (Member) commented Feb 27, 2024

This is super cool! I like how you can see what's failing when running the tests.

There are a few things I'd like to think about more:

  1. We need a way to test python+schema code snippets.

For this I think we might have to change how pytest-examples finds code snippets (or we change how we define them, maybe by doing something like #python+schema to prevent pytest-examples from picking the snippet up), and then we add our own testing plugin for them (see the sketch below).

1a. And maybe optionally also test the graphql+response code blocks

(at least to check if the response matches the query shape)

Maybe we do something like this here?

  2. We also need to figure out how to test the examples' output. Prints might be good, but I think they make the snippets quite verbose; that said, I really like that pytest-examples has a way to check them. We could also do something custom for testing schemas, like here:

Also, I think it is worth spending some time to make this a test that doesn't fail. We could introduce this for just one file (maybe the getting started guide) or we could try to fix as many things as possible.
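To make the python+schema idea concrete, one possible shape is to extract the paired blocks from the markdown, execute the Python half, and compare SDL output (str(schema) on a strawberry.Schema prints its SDL). The fence name, the --- separator, and the assumption that snippets define a `schema` variable are all illustrative here, not the approach the PR ultimately implements:

```python
import re

# Hypothetical: matches blocks like ```python+schema <code> --- <sdl> ```
BLOCK_RE = re.compile(
    r"```python\+schema\n(?P<python>.*?)\n---\n(?P<schema>.*?)```",
    re.DOTALL,
)

def check_python_schema_blocks(markdown: str) -> None:
    for match in BLOCK_RE.finditer(markdown):
        namespace: dict = {}
        exec(compile(match["python"], "<doc-example>", "exec"), namespace)
        # Assumes the Python half defines `schema = strawberry.Schema(...)`.
        assert str(namespace["schema"]).strip() == match["schema"].strip()
```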

@Sirusblk (Author)

I'll see if I can find a solution for making sure the schema matches the Python code snippet. I'll also look into whether we can test the response as well.
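As a rough illustration of that shape check (graphql-core ships with Strawberry; the helper here is invented), one could compare the query's top-level selections against the response's keys:

```python
from graphql import FieldNode, OperationDefinitionNode, parse

def response_matches_query_shape(query: str, data: dict) -> bool:
    """Check that top-level response keys match the query's selections."""
    document = parse(query)
    operation = next(
        definition
        for definition in document.definitions
        if isinstance(definition, OperationDefinitionNode)
    )
    selected = {
        (field.alias or field.name).value
        for field in operation.selection_set.selections
        if isinstance(field, FieldNode)
    }
    return selected == set(data)
```

For example, response_matches_query_shape("{ hello }", {"hello": "world"}) returns True; it deliberately checks only the top level.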

> 2. We also need to figure out how to test the examples' output. Prints might be good, but I think they make the snippets quite verbose; that said, I really like that pytest-examples has a way to check them. We could also do something custom for testing schemas, like here: ...

pytest-examples docs do show we can handle prints like so:

```python
print(1 + 2)
#> 3
```

If you like this, I can go ahead and try it out where appropriate. If the syntax doesn't look good, we could try assert statements or roll our own syntax like you suggested.
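For example, a Strawberry doc snippet written in that style could look like this (a hypothetical snippet, not one taken from the docs):

```python
import strawberry

@strawberry.type
class Query:
    @strawberry.field
    def hello(self) -> str:
        return "world"

schema = strawberry.Schema(query=Query)
result = schema.execute_sync("{ hello }")
print(result.data)
#> {'hello': 'world'}
```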

We do reuse a lot of the same structures, so we could inject some prebuilt data structures into the module namespace when testing to avoid repeating code in each example snippet. For testing schemas, we might similarly be able to use the extracted code examples and compare afterward. I'll poke around and see.

> Also, I think it is worth spending some time to make this a test that doesn't fail. We could introduce this for just one file (maybe the getting started guide) or we could try to fix as many things as possible.

I marked them as flaky so they wouldn't block pytest runs, but I can remove that and limit the scope to a file or two (with the documentation updated) to get started. I figured trying to rewrite all the examples at once might be too much for a single MR; I could tackle this with a string of MRs once we have a solid base to work from.

Thanks for the feedback!

@patrick91 (Member)

I think for now it would be fine to just go with this:

```python
print(1 + 2)
#> 3
```

We can make it better in the future; otherwise I'm afraid I'd do too much bikeshedding :D

For the imports, I think it's fine to leave them in; I'll test if we can hide them from the docs side 😊 Maybe we do a flag like this:

```python
import strawberry

@strawberry.type
class P:
    a: str
```

@Sirusblk (Author)

@patrick91 I figured out a way to test python+schema code snippets. I disabled the flaky flag and got all the general docs working. I added a way to keep the global namespace around for code snippets in the same doc file, but it relies on global variables, which I don't love.
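For illustration, the shared-namespace approach could be as simple as keying a module-level dict by doc path (a hypothetical sketch, with the auto-included imports seeded in; not the PR's code):

```python
import strawberry

# Hypothetical sketch: snippets from the same markdown file share one
# namespace, so later snippets can use names defined by earlier ones.
_NAMESPACES: dict[str, dict] = {}

def run_in_shared_namespace(doc_path: str, source: str) -> dict:
    namespace = _NAMESPACES.setdefault(doc_path, {"strawberry": strawberry})
    exec(compile(source, doc_path, "exec"), namespace)
    return namespace
```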

I'm still trying to find a good way to handle graphql+response code snippets.

@Sirusblk (Author)

Never mind, I adapted some of the pytest_examples module code to work with other code snippet types. I added another file to demonstrate those checks. It has already caught some good minor issues.
