fonts: Fix emoji font selection on macOS (#32122)

This fixes two issues that were preventing emojis from being properly
selected from fonts on macOS.

1. `CTFont::get_glyphs_for_characters` takes the input characters as
   `UniChar`, which are UTF-16 code units. We need to encode the input
   `char` as UTF-16 before passing it to CoreText (see the sketch after
   this list).
2. The font fallback list is updated with the latest logic from Gecko,
   which importantly adds "Apple Color Emoji" to the list of fallback
   fonts. Sorry for the big change, but this is a direct port of the
   code from Gecko.
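
To make the first point concrete, here is a small self-contained Rust
example (illustration only, not part of the patch) showing why a bare
`as UniChar` cast loses characters outside the Basic Multilingual Plane:

```rust
fn main() {
    let mut buffer = [0u16; 2];

    // 'A' (U+0041) is in the Basic Multilingual Plane: one UTF-16 code unit.
    assert_eq!('A'.encode_utf16(&mut buffer).len(), 1);

    // '😀' (U+1F600) is outside the BMP: it encodes to a surrogate pair.
    assert_eq!('😀'.encode_utf16(&mut buffer).len(), 2);
    assert_eq!(buffer, [0xD83D, 0xDE00]);

    // The old code cast the char straight to a 16-bit UniChar, which
    // truncates U+1F600 to 0xF600, a value CoreText cannot map to the
    // emoji's glyph.
    assert_eq!('😀' as u16, 0xF600);
}
```

With the fix, `encode_utf16` produces the surrogate pair and CoreText can
map it to a single glyph.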

With these two changes, emojis display, but only in grayscale. 😅 To fix
this, another part of the font stack will need to detect when a font
supports color and pass that information to WebRender when creating the
font instance. We will likely do this in a platform-independent way
later, as it depends on some more preliminary changes.
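
For illustration, one plausible shape for that detection, sketched as a
hypothetical helper (not part of this patch) that probes for the SFNT
tables used by the common color-font formats; 'sbix' is the table "Apple
Color Emoji" uses. It assumes the core-text crate's `get_font_table`
accessor:

```rust
use core_foundation::data::CFData;
use core_text::font::CTFont;

/// Hypothetical sketch: report whether a font has color glyphs by probing
/// for the SFNT tables that the common color-font formats carry.
fn has_color_glyphs(ctfont: &CTFont) -> bool {
    // Big-endian FourCC tags: 'sbix' (Apple Color Emoji bitmaps),
    // 'COLR' (layered color glyphs), 'CBDT' (embedded color bitmaps).
    [b"sbix", b"COLR", b"CBDT"].iter().any(|tag| {
        let table: Option<CFData> = ctfont.get_font_table(u32::from_be_bytes(**tag));
        table.is_some()
    })
}
```

The resulting flag could then be forwarded to WebRender when the font
instance is created.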

---
- [x] `./mach build -d` does not report any errors
- [x] `./mach test-tidy` does not report any errors
- [x] These changes are part of #17267.
- [x] There are tests for these changes, but the macOS CI does not
currently run WPT, so we cannot observe the updated results.

commit 363651c7f7 (parent 821893b2ee)
Martin Robinson, 2024-04-22 12:40:55 +02:00, committed by GitHub
2 changed files with 230 additions and 148 deletions

@@ -10,7 +10,6 @@ use std::{fmt, ptr};
 /// Implementation of Quartz (CoreGraphics) fonts.
 use app_units::Au;
 use byteorder::{BigEndian, ByteOrder};
-use core_foundation::base::CFIndex;
 use core_foundation::data::CFData;
 use core_foundation::string::UniChar;
 use core_graphics::font::CGGlyph;
@@ -209,17 +208,26 @@ impl PlatformFontMethods for PlatformFont {
     }
 
     fn glyph_index(&self, codepoint: char) -> Option<GlyphId> {
-        let characters: [UniChar; 1] = [codepoint as UniChar];
-        let mut glyphs: [CGGlyph; 1] = [0 as CGGlyph];
-        let count: CFIndex = 1;
+        // CTFontGetGlyphsForCharacters takes UniChar, which are UTF-16 encoded characters. We are taking
+        // a char here which is a 32bit Unicode character. This will encode into a maximum of two
+        // UTF-16 code units and produce a maximum of 1 glyph. We could safely pass 2 as the length
+        // of the buffer to CTFontGetGlyphsForCharacters, but passing the actual number of encoded
+        // code units ensures that the resulting glyph is always placed in the first slot in the output
+        // buffer.
+        let mut characters: [UniChar; 2] = [0, 0];
+        let encoded_characters = codepoint.encode_utf16(&mut characters);
+        let mut glyphs: [CGGlyph; 2] = [0, 0];
 
         let result = unsafe {
-            self.ctfont
-                .get_glyphs_for_characters(characters.as_ptr(), glyphs.as_mut_ptr(), count)
+            self.ctfont.get_glyphs_for_characters(
+                encoded_characters.as_ptr(),
+                glyphs.as_mut_ptr(),
+                encoded_characters.len() as isize,
+            )
         };
 
+        // If the call failed or the glyph is the zero glyph no glyph was found for this character.
         if !result || glyphs[0] == 0 {
-            // No glyph for this character
             return None;
         }
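
For reference, a hypothetical standalone smoke test (not part of the
patch) that drives the same CoreText call the way the fixed `glyph_index`
does. It assumes the core-text crate's `new_from_name` constructor and an
environment where "Apple Color Emoji" is installed:

```rust
use core_text::font;

fn main() {
    let emoji_font = font::new_from_name("Apple Color Emoji", 16.0)
        .expect("Apple Color Emoji should be available on macOS");

    // Encode the char as UTF-16, exactly as the patched glyph_index does.
    let mut characters = [0u16; 2];
    let encoded = '😀'.encode_utf16(&mut characters);
    let mut glyphs = [0u16; 2];

    let found = unsafe {
        emoji_font.get_glyphs_for_characters(
            encoded.as_ptr(),
            glyphs.as_mut_ptr(),
            encoded.len() as isize,
        )
    };

    // The glyph for the surrogate pair lands in the first output slot.
    assert!(found && glyphs[0] != 0);
}
```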