* libservo: Improve finding python
Servo is built in a virtual environment, which sets `VIRTUAL_ENV` to
the base path of the venv.
`PYTHON3` is only set if mach or the user sets it.
Additional changes:
- Use Path / var_os for Paths instead of strings.
  In general, using `Path` APIs is preferable, since in rare cases
  valid paths may not be valid UTF-8.
- Don't search for Python 3.8 anymore, since
  we require a newer version anyway.
- Don't add the .exe suffix anymore, since
Command::new() will take care of that.
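For illustration only, a rough sketch of the lookup order this describes
(placeholder names, not the exact libservo code; the Windows-specific venv
layout is handled in a later commit):

```rust
use std::env;
use std::path::PathBuf;

// Illustrative lookup order; not the exact libservo implementation.
fn find_python() -> PathBuf {
    // `PYTHON3` is only set if mach or the user exported it explicitly.
    if let Some(python) = env::var_os("PYTHON3") {
        return PathBuf::from(python);
    }
    // mach builds Servo inside a virtual environment, which sets
    // `VIRTUAL_ENV` to the base path of the venv. `var_os` + `PathBuf`
    // keep rare non-UTF-8 paths working.
    if let Some(venv) = env::var_os("VIRTUAL_ENV") {
        // No `.exe` suffix: `Command::new()` takes care of that on Windows.
        return PathBuf::from(venv).join("bin").join("python");
    }
    // Fall back to whatever `python3` resolves to on PATH; no need to
    // probe for a pinned `python3.8`, since a newer version is required.
    PathBuf::from("python3")
}
```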
Signed-off-by: Jonathan Schwender <schwenderjonathan@gmail.com>
Signed-off-by: Jonathan Schwender <jonathan.schwender@huawei.com>
* script: Improve finding python
Synchronize `find_python` in scripts build.rs with the version in
libservo.
Signed-off-by: Jonathan Schwender <jonathan.schwender@huawei.com>
* Apply suggestions from code review
Co-authored-by: Martin Robinson <mrobinson@igalia.com>
Signed-off-by: Jonathan Schwender <55576758+jschwe@users.noreply.github.com>
Signed-off-by: Jonathan Schwender <jonathan.schwender@huawei.com>
* Fix finding venv python on windows
- On Windows the venv scripts and python binaries are in the
`Scripts` subdirectory instead of `bin`.
- We shouldn't check if the executable in the venv binary dir
exists, since the actual file could have a `.exe` suffix.
We don't need to consider the `.exe` suffix for the filename,
since `Command` will handle that for us.
  We also check the validity of the candidate anyway by
  running `$candidate --version`.
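A rough sketch of the Windows-aware part of this, again with placeholder
names rather than the actual code:

```rust
use std::path::{Path, PathBuf};
use std::process::Command;

// On Windows, the venv puts its scripts and python binary in `Scripts`
// instead of `bin`, so pick the binary directory per platform.
fn venv_python(venv: &Path) -> PathBuf {
    let bin_dir = if cfg!(windows) { "Scripts" } else { "bin" };
    // Don't test for the file's existence here: on Windows the real file
    // is `python.exe`, and `Command` resolves that suffix for us.
    venv.join(bin_dir).join("python")
}

// The candidate is validated by actually running `--version`, which also
// catches a missing or broken interpreter.
fn is_valid(candidate: &Path) -> bool {
    Command::new(candidate)
        .arg("--version")
        .output()
        .map(|output| output.status.success())
        .unwrap_or(false)
}
```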
Signed-off-by: Jonathan Schwender <schwenderjonathan@gmail.com>
---------
Signed-off-by: Jonathan Schwender <schwenderjonathan@gmail.com>
Signed-off-by: Jonathan Schwender <jonathan.schwender@huawei.com>
Signed-off-by: Jonathan Schwender <55576758+jschwe@users.noreply.github.com>
Co-authored-by: Martin Robinson <mrobinson@igalia.com>
When playing around with Cargo’s new timing visualization:
https://internals.rust-lang.org/t/exploring-crate-graph-build-times-with-cargo-build-ztimings/10975/21
… I was surprised to see the `script` crate’s build script take 76 seconds.
I did not expect WebIDL bindings generation to be *that* computationally
intensive.
It turns out almost all of this time is overhead. The build script uses CMake
to generate bindings for each WebIDL file in parallel, but that causes a lot
of work to be repeated 366 times:
* Starting up a Python VM
* Importing (parts of) the Python standard library
* Importing ~16k lines of our Python code
* Recompiling the latter to bytecode, since we used `python -B` to disable
  writing `.pyc` files
* Deserializing with `cPickle` and recreating in memory the results
of parsing all WebIDL files
----
This commit removes the use of CMake and cPickle for the `script` crate.
Instead, all WebIDL bindings generation is done sequentially
in a single Python process. This takes 2 to 3 seconds.
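For illustration, a hypothetical outline of what the build-script side of
this looks like; the script path and interpreter choice are placeholders,
not the real codegen entry point:

```rust
// build.rs (illustrative): run the WebIDL codegen once, in a single
// Python process, instead of letting CMake spawn one process per file.
use std::env;
use std::path::PathBuf;
use std::process::Command;

fn main() {
    let out_dir = PathBuf::from(env::var_os("OUT_DIR").expect("OUT_DIR not set"));
    // Placeholder script path; the real entry point lives in the repo.
    let status = Command::new("python3")
        .arg("codegen/run.py")
        .arg(&out_dir)
        .status()
        .expect("failed to spawn Python");
    assert!(status.success(), "WebIDL bindings generation failed");
}
```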
Add Windows x86 build job.
This will make it easier to start working on the Hololens embedding without having to deal with a broken build first.
---
- [x] `./mach build -d` does not report any errors
- [x] `./mach test-tidy` does not report any errors
- [x] There are tests for these changes
It’s a compiler plugin that uses unstable compiler APIs
that are not on a path to stabilization.
With this change, there is one less thing that might break
when we update the compiler. For example:
https://github.com/sfackler/rust-phf/pull/101