
(🎁) Improve usability of --install-types #10600

Open
KotlinIsland opened this issue Jun 8, 2021 · 36 comments
Labels
feature · meta (Issues tracking a broad area of work) · topic-usability

Comments

@KotlinIsland
Contributor

Feature

mypy --install-types requirements.txt will install type stubs for all dependencies listed in the file.

Pitch

I can't see a generic way to set up an environment ahead of time.

@JelleZijlstra
Member

Another useful addition could be to make mypy emit the libraries it would install, so you can do something like mypy --types-to-be-installed >> requirements.txt.

@mikepurvis

mikepurvis commented Jun 9, 2021

Just to add to this, the --install-types flag only seems to work after a previous run has populated the mypy cache. This leads to the unfortunate situation in CI of having to basically do:

yes | mypy src --install-types || true
mypy src

Which seems super gross and leads to jumbled, confusing console output, with errors followed by a clean run.

IMO if --install-types doesn't have a cache to go on, and can't discover the needed types using the package dependencies or a requirements file, it would be great to have a CI-friendly mode where it can run only the dependency-tracing and not emit any errors (and not need a y prompt to pip!).
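For reference, a CI-friendly variant of the workaround above might look like this (just a sketch that hides the first run's noise; it doesn't add any new mypy behavior):

```
# First pass only installs missing stubs (and populates the cache); its errors
# and the pip confirmation prompt are hidden to keep the CI log clean.
yes | mypy src --install-types > /dev/null 2>&1 || true
# Second pass is the real type check.
mypy src
```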

@JukkaL
Collaborator

JukkaL commented Jun 9, 2021

I wouldn't recommend running --install-types in CI as it currently works, since in the worst case it can almost double the mypy runtime. It also produces noisy output, as mentioned above.

Right now this is possible (as a one-time thing -- commit the changes to requirements.txt):

mypy --install-types
pip freeze | grep '^types[-]' >> requirements.txt

This isn't very intuitive, however. I can also see how some projects would prefer not to maintain stub requirements explicitly and would be happy to use the latest versions of all available/known stub packages.

Below I'll try to summarize the ideas above. I can see several somewhat different but related use cases. The option names are open to bikeshedding.

Use case 1: Use requirements.txt

Infer types from requirements.txt instead of from import dependencies. This could be supported via mypy --install-types -r requirements.txt, for example (or --requirement requirements.txt). This would still produce an interactive prompt by default.

This could be used both as a one-off action and as part of every CI run. This would always install the latest stubs.

Example:

$ cat requirements.txt
requests==x.y.z
$ mypy --install-types -r requirements.txt
Installing stub packages:
python3 -m pip install types-requests

Install? [yN]

Use case 2: Install types non-interactively

Unconditionally install type packages and don't ask for confirmation. When type checking (i.e. not using requirements.txt), this could also silence normal error output about missing stubs. This could be supported via mypy --install-types --non-interactive, for example, possibly together with -r requirements.txt.
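For illustration, the combined form could look like this (a sketch; both the --non-interactive behavior described here and the -r flag are part of the proposal, not existing options at the time of writing):

```
# Proposed: install stub packages without prompting, silencing missing-stub errors
mypy --install-types --non-interactive -r requirements.txt
```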

Use case 3: Generate requirements output

Instead of installing stubs, produce output suitable for requirements.txt. This could be supported via mypy --types-requirements. This option would imply --non-interactive. This would also support -r.

Here's how this could look (note: no error output about missing stubs):

$ mypy --types-requirements src/
types-emoji>=0.1.0
types-requests>=0.1.0

I'm not sure whether this should look up the latest versions of stub packages and use types-<foo>==<latest_version>. Perhaps we shouldn't include any type packages that are already installed.
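Until something like --types-requirements exists, a rough approximation can be scraped from mypy's existing "pip install types-<pkg>" hints (a sketch; the grep pattern is a guess and may over- or under-match):

```
# Collect stub package names suggested by a normal mypy run
mypy src/ 2>&1 | grep -o 'types-[A-Za-z0-9._-]*' | sort -u > types-requirements.txt
```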

@JukkaL changed the title from "add ability to --install-types from a requirements file" to "Improve usability of --install-types" on Jun 9, 2021
@dcfranca

dcfranca commented Jun 9, 2021

Is it possible to run mypy --install-types in a CI environment? I have tried, but it gets stuck on the interactive part... Is there a workaround available, or will we have to wait for the --non-interactive flag to be implemented?

@mikepurvis

mikepurvis commented Jun 9, 2021

In the absence of a recommendation to use --install-types in CI, is the suggestion to add the types packages to your main requirements.txt? Or a separate mypy-requirements.txt or types-requirements.txt? Or do you make them setuptools install_requires entries? Or add them to an extras_require group like types?

It'd be great to have some guidance from the project on this.

@dcfranca

dcfranca commented Jun 9, 2021

@mikepurvis we have added them to requirements.txt; just wondering if there is another solution so we don't have to do this in all our projects.

@JukkaL
Collaborator

JukkaL commented Jun 9, 2021

My recommendation is to do one of these (each has different tradeoffs):

  1. Add type dependencies to your main requirements.txt
  2. Add type dependencies to a separate mypy-requirements.txt or types-requirements.txt (this is preferred if you already have a dedicated requirements file for mypy)
  3. Create a types-requirements.txt file and use -r types-requirements.txt to include it in another requirements file (or many files across multiple projects)
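As a concrete sketch of options 2 and 3 (the file name and stub packages below are illustrative):

```
$ cat types-requirements.txt
types-requests
types-PyYAML
$ pip install -r types-requirements.txt
$ mypy src/
```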

The --non-interactive flag should be easy to implement. Since this seems to impact a lot of projects, I'm leaning towards making a 0.910 release with the --non-interactive flag within the next week or so.

If/when 0.910 is out, you'd also have the option of running something like mypy --install-types --non-interactive src/ ... in your CI scripts before you invoke mypy to actually type check your code. This would simplify the maintenance of your dependencies, as you'd always get the latest stubs, but it would slow down your CI at least a little since you'd need to run mypy twice. Also your build could start failing because of changes to stubs, but if you are already not pinning to a particular mypy version you probably don't care about this much.
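Concretely, that CI flow could look like this (a sketch; the src/ path is illustrative):

```
mypy --install-types --non-interactive src/   # install any missing stub packages
mypy src/                                     # the actual type check
```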

Please let me know if none of the above options work for you. Generating type requirements from your main requirements file (use case 1 above) is more effort to implement, so we may not have it available soon, unless somebody would like to contribute it.

@rafaellehmkuhl

The --non-interactive flag seems like it would solve most of the problems.

@j616

j616 commented Jun 9, 2021

I agree. Non-interactive would return to something along the lines of the original behaviour, admittedly with the interface change in the form of the new flags, and would work for my team's workflows. We've currently had to pin back mypy on all of our repos because of the lack of a non-interactive mode. It's broken CI for us.

@chorner

chorner commented Jun 9, 2021

I'd like to have some confidence that the package matches the type definitions I'm downloading. Does typeshed not provide a mechanism for mapping package versions to type package versions?

With version pinning, it seems inevitable that if it isn't automated the versions will one day fall out of sync... crippling the value of type checking. So I'd say making it non-interactive isn't enough; it also needs to record the versions of the type packages it is going to install.

JukkaL added a commit that referenced this issue Jun 10, 2021
It also doesn't show errors, making it useful for running CI jobs.

Work on #10600.
JukkaL added a commit that referenced this issue Jun 10, 2021
It also doesn't show errors, making it useful in CI jobs.

Example:
```
$ mypy --install-types --non-interactive -c 'import click'
Installing missing stub packages:
/Users/jukka/venv/mypy/bin/python3 -m pip install types-click

Collecting types-click
  Using cached types_click-7.1.0-py2.py3-none-any.whl (13 kB)
Installing collected packages: types-click
Successfully installed types-click-7.1.0
```

Work on #10600.
@JukkaL
Collaborator

JukkaL commented Jun 10, 2021

--non-interactive is now supported on git master. It would be great if some of you could try it out before I make the 0.910 release.

@chorner Typeshed supports defining the target library version that stubs support (in METADATA.toml). It's reflected in the version of the types package on PyPI. Since it hasn't been filled in for most stubs, it's not very useful yet. This is still better than what we used to have before, as previously there was no support for specifying the supported package versions.

Once typeshed has more dependable version information, at least the proposed variant of --install-types that looks at target packages in a requirements file could be able to make sure the installed type package is compatible with the installed version of the package.

For example, if we have versions 1.5.2 and 2.0.4 of types-foobar on PyPI, and you have foobar==1.4.2 in your requirements.txt, mypy --install-types -r requirements.txt would install version 1.5.2 of types-foobar. But if you have foobar>=2.0, we'd install version 2.0.4.
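Spelled out as a sketch (all version numbers here are hypothetical, and the -r form is still a proposal):

```
$ cat requirements.txt
foobar==1.4.2
$ mypy --install-types -r requirements.txt
# would install types-foobar==1.5.2 (not 2.0.4), matching the foobar 1.x pin
```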

@mikepurvis

Is the expectation that separate type stub packages are a long-term thing? I kind of assumed they were a 12-24 month bridge and that the hope, in the end, is that most popular dependencies would supply their own type information, at least at the API boundaries. I suppose the lesson of Python 3 is not to assume that any hack is a short-term thing.

@JukkaL
Collaborator

JukkaL commented Jun 10, 2021

I expect that types stub packages will be around for a long time, but hopefully they will be needed much less frequently in the future. It's not really something we can control, since the decision to bundle stubs or include inline annotations is up to individual project maintainers.

Making the workflows not suck is thus pretty important.

@larroy

larroy commented Jun 11, 2021

Seems like --install-types doesn't work when there's no cache directory:

mypy --install-types
Error: no mypy cache directory (you must enable incremental mode)

@john-bodley

john-bodley commented Jun 11, 2021

Somewhat related (per https://github.com/pre-commit/mirrors-mypy/issues/50): is there merit in having --install-types run pre-execution (rather than post, as it does now), so it could be run once (inlining the installation), i.e.,

mypy --install-types --non-interactive program.py

rather than via two steps, which is non-viable when using it with pre-commit, i.e.,

mypy --install-types --non-interactive 
mypy program.py

@asottile
Contributor

I expect that types stub packages will be around for a long time, but hopefully they will be needed much less frequently in the future

I kinda have the opposite hope -- take, for example, setting up a separate type checking environment: I'd rather install a handful of text files (on the order of KB) than the actual libraries (on the order of MB), especially for libraries with native extensions.

Inline types also aren't possible for py_modules-based distributions, given that PEP 561 requires folders. So without a PEP improving that, separate stubs will be required indefinitely to satisfy that use case.

@RouquinBlanc

Same as @larroy: it does not work, asking for incremental mode even when incremental mode is explicitly requested (and isn't it the default mode?):

$ /private/tmp/.tox/mypy/bin/mypy --install-types --non-interactive
Error: no mypy cache directory (you must enable incremental mode)

$ /private/tmp/.tox/mypy/bin/mypy --install-types --non-interactive --incremental
Error: no mypy cache directory (you must enable incremental mode)

@JukkaL
Collaborator

JukkaL commented Jun 11, 2021

@larroy @RouquinBlanc Did you actually have some files for mypy to type check? If you just use mypy --install-types without passing any files or directories, mypy will try to use the results of the previous run, which are stored in the cache directory. You can also use files=... in your config file. Alternatively, the directory where you run mypy might be read-only, so mypy can't create cache files.

In any case, the error message is confusing.

@RouquinBlanc

Hi @JukkaL,

The use case is running mypy with tox in a container. The mypy section was configured as follows:

[testenv:mypy]
basepython=python3.8
deps=mypy
commands=python -m mypy -p {posargs:mypackage}
skip_install=true

After following this ticket I naively tried to modify it like this:

[testenv:mypy]
basepython=python3.8
deps=mypy
commands=
    python -m mypy --install-types --non-interactive
    python -m mypy -p {posargs:mypackage}
skip_install=true

But as you say, because it has not yet run once, it fails... If I manually call mypy --install-types --non-interactive again afterwards, then it works, but starting out by failing is not a desirable way of working.

For those tests we skip installation, do not have a requirements.txt to work with, and do not require one in that place for various reasons (not saying we don't have one elsewhere, just that for those tests we rely on setup.cfg install_requires and work with bleeding-edge versions).

A very short-term quick fix is to manually define the list of packages that require external type stubs:

[testenv:mypy]
basepython=python3.8
deps=mypy
commands=
    # TODO replace this
    python3 -m pip install types-PyYAML
    python -m mypy -p {posargs:mypackage}

But that's only a quick fix to me... What if tomorrow another dependency needs external types as well?

In that sense, there are two propositions above that would make sense for our scenario:

  • the option of @john-bodley to run it as a one-liner, installing types on the fly
  • or, although less optimal for that setup, the option to pre-generate the list of typing requirements so that we can install them prior to running mypy, as suggested by @JelleZijlstra

@JukkaL
Collaborator

JukkaL commented Jun 11, 2021

Hmm, it looks like the current behavior is still somewhat problematic.

What if we changed --install-types to run the type check again after installing stubs? Then mypy --install-types --non-interactive src/ would both install types and produce type checking results, similar to mypy src/ in earlier mypy versions.

Currently two runs are needed for the same results:

  1. mypy --install-types --non-interactive src/
  2. mypy src/

LefterisJP added a commit to LefterisJP/rotkehlchen that referenced this issue Jun 12, 2021
Find a better way, this is ugly.

More info and fix coming here: python/mypy#10600
ModernMAK added a commit to MAK-Relic-Tool/Workflows that referenced this issue Sep 28, 2022
--install-types requires CI to run twice to populate a mypy cache
OR 
--install-types requires a special pip freeze action to store type-stub requirements in a requirements.txt 

Both are pretty terrible for CI, and it's being discussed:
python/mypy#10600

Until that's resolved, just ignore missing type stubs via the mypy.ini config file
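A sketch of what that mypy.ini change could look like (ignore_missing_imports silences all missing-import errors, which is broader than just missing stubs):

```
# write a minimal mypy.ini that ignores missing imports
cat > mypy.ini <<'EOF'
[mypy]
ignore_missing_imports = True
EOF
```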
birthdaysgift pushed a commit to fenya123/forum123 that referenced this issue Jan 6, 2023
We want PRs to have an explicit indicator for linter errors in the branch/PR. It will allow us to see the state of the branch without explicitly running linters locally if we don't want to.

In the scope of this task we need to add a basic CI pipeline with linters and status indicator on GitHub.

Steps to do:
	- add `Dockerfile` for the application (we don't want to add `ENTRYPOINT` in it for now, since we suppose that it is better to have container entrypoints inside the `docker-compose` file)
	- add `docker-compose` file
	Compose file should have following services:
	- `forum123-build` (will be used to build application)
	- `forum123-mypy` (to run `mypy`)
	- `forum123-flake8` (to run `flake8`)
	- `forum123-pylint` (to run `pylint`)
	We want them separated into different services so we can run the build in one job and then run all the linters in parallel in three different jobs (parallel execution will be implemented sometime later; for now we only need separate services in the compose file).
	- setup CI pipeline using GitHub Actions

Also we want to put this configuration in a separate folder like `envs/dev` to indicate that it is only for development purposes. And when we need to add some production infrastructure, it will go into `envs/prod`.
Using different folders seems more convenient than having a bunch of Dockerfiles and docker-compose files with suffixes like `.dev` and `.prod`.

We decided to put mypy stubs in `requirements-dev.txt` because we had trouble installing types on CI. In our case we had to run `mypy` twice: the first time to populate `mypy`'s cache and determine which types are missing, and the second time to install the types and check for errors. The cause of this issue is that GitHub always runs its jobs in new, clean containers, so each new `mypy` run will not have `mypy`'s cache from the previous run. Therefore we had to run `mypy` twice in every job to have type stubs installed. You can read more about related problems with missing type stubs on CI here: python/mypy#10600

Worth mentioning that we're not going to optimize this pipeline for performance right now. We plan to add caching of intermediate Docker layers later in the scope of another task. For now we just need this pipeline to work, nothing more.
@JakeSummers

Related issue: #14663

@KotlinIsland changed the title from "Improve usability of --install-types" to "(🎁) Improve usability of --install-types" on Feb 9, 2023
swinarga added a commit to NoelJungnickel/flights-co2-tracker that referenced this issue Jun 12, 2023
@FeryET

FeryET commented Jul 4, 2023

Is there any way to suppress the "error" or "warning" messages when running mypy --install-types? I want it to install the types in a CI job and don't want it to fail because of mypy errors atm.

@Anton-Constructor

Is there any way to suppress the "error" or "warning" messages when running mypy --install-types? I want it to install the types in a CI job and don't want it to fail because of mypy errors atm.

I have the same question. At the moment I use this:
mypy --install-types --non-interactive
But the problem is that it gives an error like this:

error: Can't determine which types to install with no files to check (and no cache from previous mypy run)
Error: Process completed with exit code 2.

If I add the dot at the end, it shows me all the errors in my code from mypy:
mypy --install-types --non-interactive .

I have no idea how to install the stub dependencies without also running the mypy checks.
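One workaround, sketched below, is to let the first invocation do the stub installation but throw away its output and exit status, and then run the real check separately (this is just shell plumbing, not a dedicated mypy feature):

```
# Let the first run install the stubs, but discard its output and exit status;
# only the second run's results matter.
mypy --install-types --non-interactive . > /dev/null 2>&1 || true
mypy .
```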

@MichaelBonnet

I am having this same issue. Hard to know what the proper approach is.

@AntoonGa

AntoonGa commented Feb 7, 2024

The best I managed to do in my CI is to pre-generate the logs and install the requirements after that.

 # MYPY: check type definitions in the codebase
 - pip install mypy
 # Run mypy to get the stub requirements. This is because stubs don't ship with packages anymore
 - mypy . || true
 # Extract most stubs from the previous mypy log
 - mypy --install-types --non-interactive
 # Run mypy, ignoring missing imports
 - mypy . --ignore-missing-imports

This is slow because mypy needs to run twice; in addition, mypy --install-types --non-interactive does not catch all the imports, so I have to use the --ignore-missing-imports option.

Is there any better way to integrate mypy into my CI?

@hauntsaninja
Collaborator

As documented in https://mypy.readthedocs.io/en/stable/running_mypy.html#library-stubs-not-installed, --install-types can be slow because it effectively runs mypy twice. I recommend just collecting the stub packages it installs into a requirements.txt file that you check in, and then installing that in CI with e.g. pip install -r requirements.txt

Also, if you're using the latest mypy, I recommend --disable-error-code import-untyped as a slightly safer replacement for --ignore-missing-imports.
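A sketch of that recommendation end to end (the file name below is illustrative):

```
# locally, once: let mypy install the missing stubs, then record them
mypy --install-types src/
pip freeze | grep '^types-' >> types-requirements.txt

# in CI: install the recorded stubs before the type check
pip install -r types-requirements.txt
mypy src/
```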

@Simon-Bertrand


Hi,
Sorry, but I fail to understand something: do we definitely need to add one more file to the project root just to run mypy in CI and manage stub dependencies manually, or do we have to run mypy N times just to get the typed dependencies?

Maybe I'm missing something, but it seems like a lot of friction for very little.
