Auto merge of #24761 - servo-wpt-sync:wpt_update_17-11-2019, r=jdm

Sync WPT with upstream (17-11-2019)

Automated downstream sync of changes from upstream as of 17-11-2019.
[no-wpt-sync]
r? @servo-wpt-sync
bors-servo 2019-11-17 15:12:16 -05:00 committed by GitHub
commit d1db623dfc
No known key found for this signature in database
GPG key ID: 4AEE18F83AFDEB23
391 changed files with 5972 additions and 7614 deletions


@ -0,0 +1,4 @@
[2d.transformation.getTransform.html]
[This test ensures that getTransform works correctly.]
expected: FAIL


@ -14,9 +14,6 @@
[Revoke blob URL after creating Request, will fetch]
expected: FAIL
[Revoke blob URL after calling fetch, fetch should succeed]
expected: FAIL
[url-with-fetch.any.html]
[Untitled]
@ -37,3 +34,6 @@
[Revoke blob URL after creating Request, will fetch]
expected: FAIL
[Revoke blob URL after calling fetch, fetch should succeed]
expected: FAIL

File diff suppressed because it is too large


@ -0,0 +1,2 @@
[floats-in-table-caption-001.html]
expected: FAIL


@ -0,0 +1,4 @@
[hit-test-floats-001.html]
[hit-test-floats-001]
expected: FAIL


@ -5,9 +5,9 @@
[[data-expected-height\] 7]
expected: FAIL
[[data-expected-height\] 3]
[[data-expected-height\] 1]
expected: FAIL
[[data-expected-height\] 4]
[[data-expected-height\] 2]
expected: FAIL


@ -1,2 +0,0 @@
[white-space-002.xht]
expected: FAIL


@ -1,2 +0,0 @@
[white-space-003.xht]
expected: FAIL


@ -1,2 +0,0 @@
[line-height-204.html]
expected: FAIL


@ -1,2 +0,0 @@
[mix-blend-mode-paragraph.html]
expected: FAIL


@ -1,7 +0,0 @@
[CSSPseudoElement-getAnimations.tentative.html]
[getAnimations returns CSSAnimation objects]
expected: FAIL
[getAnimations returns CSS transitions/animations, and script-generated animations in the expected order]
expected: FAIL


@ -1,2 +0,0 @@
[background-repeat-round-roundup.xht]
expected: FAIL


@ -0,0 +1,2 @@
[hyphens-out-of-flow-001.html]
expected: FAIL


@ -0,0 +1,2 @@
[line-break-normal-018.xht]
expected: FAIL


@ -0,0 +1,2 @@
[line-break-strict-018.xht]
expected: FAIL


@ -0,0 +1,2 @@
[text-transform-full-size-kana-001.html]
expected: FAIL


@ -0,0 +1,2 @@
[text-transform-full-size-kana-002.html]
expected: FAIL


@ -0,0 +1,2 @@
[text-transform-full-size-kana-003.html]
expected: FAIL


@ -0,0 +1,2 @@
[text-transform-full-size-kana-004.html]
expected: FAIL


@ -0,0 +1,2 @@
[trailing-ideographic-space-004.html]
expected: FAIL


@ -1,2 +0,0 @@
[word-break-break-all-007.html]
expected: FAIL


@ -1,2 +0,0 @@
[word-break-keep-all-006.html]
expected: FAIL


@ -1,5 +1,5 @@
[perspective-interpolation.html]
expected: ERROR
expected: CRASH
[ perspective interpolation]
expected: FAIL


@ -1,2 +0,0 @@
[css-transforms-3d-on-anonymous-block-001.html]
expected: FAIL


@ -1,4 +0,0 @@
[CSSPseudoElement-getAnimations.tentative.html]
[getAnimations sorts simultaneous transitions by name]
expected: FAIL


@ -1,2 +1,2 @@
[no-transition-from-ua-to-blocking-stylesheet.html]
expected: TIMEOUT
expected: FAIL


@ -1,4 +0,0 @@
[elementsFromPoint-invalid-cases.html]
[The root element is the last element returned for otherwise empty queries within the viewport]
expected: FAIL


@ -0,0 +1,2 @@
[HTMLMediaElement.html]
expected: CRASH


@ -0,0 +1,2 @@
[contenttype_html.html]
expected: CRASH


@ -0,0 +1,2 @@
[contenttype_txt.html]
expected: CRASH


@ -0,0 +1,2 @@
[contenttype_xml.html]
expected: CRASH


@ -0,0 +1,15 @@
[response-stream-disturbed-by-pipe.any.html]
[using pipeThrough on Response body should disturb it synchronously]
expected: FAIL
[using pipeTo on Response body should disturb it synchronously]
expected: FAIL
[response-stream-disturbed-by-pipe.any.worker.html]
[using pipeThrough on Response body should disturb it synchronously]
expected: FAIL
[using pipeTo on Response body should disturb it synchronously]
expected: FAIL


@ -312,9 +312,21 @@
[<iframe>: separate response Content-Type: */* text/html]
expected: FAIL
[<iframe>: combined response Content-Type: */* text/html]
[<iframe>: combined response Content-Type: text/html;" text/plain]
expected: FAIL
[<iframe>: combined response Content-Type: text/html;" \\" text/plain]
[<iframe>: combined response Content-Type: text/html;charset=gbk text/plain text/html]
expected: FAIL
[<iframe>: separate response Content-Type: text/html;" text/plain]
expected: FAIL
[<iframe>: separate response Content-Type: text/html */*]
expected: FAIL
[<iframe>: separate response Content-Type: text/plain */*]
expected: FAIL
[<iframe>: combined response Content-Type: text/html */*;charset=gbk]
expected: FAIL


@ -56,3 +56,6 @@
[separate text/javascript x/x]
expected: FAIL
[separate text/javascript error]
expected: FAIL


@ -0,0 +1,4 @@
[split-cache.tentative.html]
[HTTP Cache - Partioning by top-level origin 1]
expected: FAIL


@ -11,6 +11,9 @@
[X-Content-Type-Options%3A%20nosniff%0C]
expected: FAIL
[X-Content-Type-Options%3A%20%40%23%24%23%25%25%26%5E%26%5E*()()11!%2Cnosniff]
[X-Content-Type-Options%3A%20%2Cnosniff]
expected: FAIL
[Content-Type-Options%3A%20nosniff]
expected: FAIL


@ -5,7 +5,7 @@
expected: FAIL
[Embedded credentials are treated as network errors in frames.]
expected: TIMEOUT
expected: FAIL
[Embedded credentials are treated as network errors in new windows.]
expected: FAIL


@ -1,4 +1,4 @@
[traverse_the_history_5.html]
[traverse_the_history_2.html]
[Multiple history traversals, last would be aborted]
expected: FAIL


@ -0,0 +1,4 @@
[traverse_the_history_3.html]
[Multiple history traversals, last would be aborted]
expected: FAIL


@ -0,0 +1,4 @@
[traverse_the_history_4.html]
[Multiple history traversals, last would be aborted]
expected: FAIL


@ -1,7 +1,7 @@
[document_domain_access_details.sub.html]
expected: TIMEOUT
[Access allowed if same-origin with no 'document.domain' modification. (Sanity check)]
expected: TIMEOUT
expected: FAIL
[Access is revoked to Window object when we stop being same effective script origin due to document.domain.]
expected: NOTRUN
@ -13,7 +13,7 @@
expected: NOTRUN
[Access not allowed if different-origin with no 'document.domain' modification. (Sanity check)]
expected: NOTRUN
expected: FAIL
[Access disallowed again if same-origin, both set document-domain to existing value, then one sets to parent.]
expected: NOTRUN
@ -22,13 +22,13 @@
expected: NOTRUN
[Access allowed if same-origin and both set document.domain to existing value.]
expected: NOTRUN
expected: TIMEOUT
[Access is not revoked to Document object when we stop being same effective script origin due to document.domain.]
expected: NOTRUN
[Access disallowed if same-origin but only one sets document.domain.]
expected: NOTRUN
expected: FAIL
[Access evolves correctly for cross-origin objects when we join up via document.domain and then diverge again.]
expected: NOTRUN


@ -0,0 +1,4 @@
[sandbox-disallow-scripts-via-unsandboxed-popup.tentative.html]
[Sandboxed => unsandboxed popup]
expected: FAIL


@ -5,3 +5,12 @@
[Element with tabindex should support autofocus]
expected: FAIL
[Host element with delegatesFocus including no focusable descendants should be skipped]
expected: FAIL
[Area element should support autofocus]
expected: FAIL
[Host element with delegatesFocus should support autofocus]
expected: FAIL


@ -1,6 +1,5 @@
[iframe_sandbox_popups_nonescaping-3.html]
type: testharness
expected: TIMEOUT
[Check that popups from a sandboxed iframe do not escape the sandbox]
expected: NOTRUN
expected: FAIL


@ -0,0 +1,10 @@
[non-active-document.html]
[DOMParser]
expected: FAIL
[createHTMLDocument]
expected: FAIL
[<template>]
expected: FAIL


@ -1,5 +1,5 @@
[form-double-submit-2.html]
expected: ERROR
[preventDefault should allow onclick submit() to succeed]
expected: FAIL
expected: TIMEOUT


@ -1,5 +1,5 @@
[form-double-submit-3.html]
expected: ERROR
[<button> should have the same double-submit protection as <input type=submit>]
expected: FAIL
expected: TIMEOUT


@ -1,5 +1,4 @@
[form-submission-algorithm.html]
expected: TIMEOUT
[If form's firing submission events is true, then return; 'submit' event]
expected: FAIL
@ -18,6 +17,3 @@
[firing an event named submit; form.requestSubmit()]
expected: FAIL
[Cannot navigate (after constructing the entry list)]
expected: TIMEOUT


@ -0,0 +1,5 @@
[button-submit-children.html]
expected: TIMEOUT
[This test will pass if a form navigation successfully occurs when clicking a child element of a <button type=submit> element with a onclick event handler which prevents the default form submission and manually calls form.submit() instead.]
expected: TIMEOUT


@ -0,0 +1,2 @@
[script-onerror-insertion-point-2.html]
expected: TIMEOUT


@ -1,4 +0,0 @@
[DOMContentLoaded-defer.html]
[The end: DOMContentLoaded and defer scripts]
expected: FAIL


@ -0,0 +1,4 @@
[2d.transformation.getTransform.html]
[This test ensures that getTransform works correctly.]
expected: FAIL


@ -12,6 +12,3 @@
[Verifies the resolution of entry.startTime is at least 5 microseconds.]
expected: TIMEOUT
[Verifies the resolution of performance.now() is at least 5 microseconds.]
expected: FAIL


@ -1,5 +1,5 @@
[nested-context-navigations-iframe.html]
expected: TIMEOUT
expected: CRASH
[Test that iframe navigations are not observable by the parent, even after history navigations by the parent]
expected: FAIL


@ -1,2 +0,0 @@
[resource_timing_buffer_full_eventually.html]
expected: CRASH


@ -0,0 +1,23 @@
[multi-value.any.worker.html]
expected: TIMEOUT
[multiple return values from wasm to js]
expected: TIMEOUT
[multiple return values inside wasm]
expected: NOTRUN
[multiple return values from js to wasm]
expected: NOTRUN
[multi-value.any.html]
expected: TIMEOUT
[multiple return values from wasm to js]
expected: TIMEOUT
[multiple return values inside wasm]
expected: NOTRUN
[multiple return values from js to wasm]
expected: NOTRUN


@ -1,5 +1,4 @@
[realtimeanalyser-fft-scaling.html]
expected: TIMEOUT
[X 2048-point FFT peak position is not equal to 64. Got 0.]
expected: FAIL


@ -0,0 +1,10 @@
[validity.py]
[test_pause_positive_integer[key\]]
expected: FAIL
[test_pause_positive_integer[none\]]
expected: FAIL
[test_pause_positive_integer[pointer\]]
expected: FAIL


@ -0,0 +1,5 @@
[018.html]
expected: TIMEOUT
[origin of the script that invoked the method, javascript:]
expected: TIMEOUT


@ -0,0 +1,5 @@
[017.html]
expected: TIMEOUT
[origin of the script that invoked the method, about:blank]
expected: TIMEOUT


@ -1,4 +0,0 @@
[WorkerGlobalScope-close.html]
[Test sending a message after closing.]
expected: FAIL


@ -1,4 +1,5 @@
[import-in-moduleworker.html]
expected: ERROR
[Base URL in module dedicated workers: import]
expected: FAIL


@ -1,5 +1,6 @@
[003.html]
type: testharness
expected: ERROR
[shared]
expected: FAIL


@ -0,0 +1,2 @@
[transition_calc_implicit.html]
expected: TIMEOUT


@ -0,0 +1,4 @@
[animation-removed-node.html]
[Animations are no longer active when a node can't be animated.]
expected: FAIL


@ -73,7 +73,16 @@ tasks:
A subset of WPT's "${chunk[0]}" tests (chunk number ${chunk[1]}
of ${chunk[2]}), run in the ${browser.channel} release of
${browser.name}.
owner: ${event.pusher.email}
owner:
# event.pusher.email is null when the event comes from a GitHub Action, so it has to be
# null-checked. The "in" operator causes an evaluation error when its right-hand variable
# is null, even if it is guarded in the same "if" expression as the null-check (with &&),
# therefore we use a nested "if" here.
$if: 'event.pusher.email'
then:
$if: '"@" in event.pusher.email'
then: ${event.pusher.email}
else: web-platform-tests@users.noreply.github.com
else: web-platform-tests@users.noreply.github.com
source: ${event.repository.url}
payload:
image:
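The nested `$if` logic above can be mirrored in plain Python to make the evaluation-order concern concrete. This is an illustrative sketch only (the function name is invented); the fallback address is the one used in the template:

```python
def resolve_owner(pusher_email):
    """Mirror of the nested $if above: JSON-e's "in" operator raises an
    evaluation error when its right-hand side is null, so the null-check
    must live in an outer $if rather than being &&-combined inline."""
    fallback = "web-platform-tests@users.noreply.github.com"
    if pusher_email:                # outer $if: null-check
        if "@" in pusher_email:    # inner $if: looks like an email address
            return pusher_email
        return fallback
    return fallback
```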


@ -0,0 +1,39 @@
<script src="/resources/testharness.js"></script>
<script src="/resources/testharnessreport.js"></script>
<body>
<script>
// Ensure that context2d.getTransform works
const epsilon = 1e-5;
const canvas = document.createElement('canvas');
const ctx = canvas.getContext('2d');
test(function(t) {
assert_array_equals(ctx.getTransform().toFloat32Array(),
[1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1],
"Assert that an untransformed matrix is identity");
ctx.scale(2, 3);
let transform = ctx.getTransform();
assert_array_equals(ctx.getTransform().toFloat32Array(),
[2, 0, 0, 0, 0, 3, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1],
"Assert that context2d scaling works");
ctx.rotate(Math.PI/2);
transform = ctx.getTransform();
assert_array_approx_equals(ctx.getTransform().toFloat32Array(),
[0, 3, 0, 0, -2, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1], epsilon,
"Assert that context2d rotate works");
ctx.translate(1, -1);
transform = ctx.getTransform();
assert_array_approx_equals(ctx.getTransform().toFloat32Array(),
[0, 3, 0, 0, -2, 0, 0, 0, 0, 0, 1, 0, 2, 3, 0, 1], epsilon,
"Assert context2d translate works.");
ctx.resetTransform();
assert_array_equals(ctx.getTransform().toFloat32Array(),
[1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1, 0, 0, 0, 0, 1],
"Assert that a reset matrix is identity");
}, 'This test ensures that getTransform works correctly.');
</script>
</body>
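The matrices asserted in the test above follow from composing 2D affine transforms, where each canvas operation post-multiplies the current matrix. A small Python sketch (using the canvas `(a, b, c, d, e, f)` matrix convention; exact values substituted for cos/sin of π/2) reproduces them:

```python
def mul(m, n):
    # Compose 2D affine matrices (a, b, c, d, e, f): result = m * n,
    # i.e. n is applied first in the transformed coordinate space,
    # matching how canvas context operations post-multiply.
    a1, b1, c1, d1, e1, f1 = m
    a2, b2, c2, d2, e2, f2 = n
    return (a1*a2 + c1*b2, b1*a2 + d1*b2,
            a1*c2 + c1*d2, b1*c2 + d1*d2,
            a1*e2 + c1*f2 + e1, b1*e2 + d1*f2 + f1)

m = (1, 0, 0, 1, 0, 0)            # identity
m = mul(m, (2, 0, 0, 3, 0, 0))    # ctx.scale(2, 3)
m = mul(m, (0, 1, -1, 0, 0, 0))   # ctx.rotate(pi/2): (cos, sin, -sin, cos, 0, 0)
m = mul(m, (1, 0, 0, 1, 1, -1))   # ctx.translate(1, -1)
# m == (0, 3, -2, 0, 2, 3), i.e. the a, b, c, d, e, f entries of the
# final 4x4 column-major array [0, 3, 0, 0, -2, 0, 0, 0, 0, 0, 1, 0, 2, 3, 0, 1].
```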


@ -9,6 +9,16 @@ idl_test(
['BackgroundSync'],
['service-workers', 'html', 'dom'],
idlArray => {
// TODO: Objects
const isServiceWorker = location.pathname.includes('.serviceworker.');
if (isServiceWorker) {
idlArray.add_objects({
ServiceWorkerGlobalScope: ['self', 'onsync', 'onperiodicsync'],
ServiceWorkerRegistration: ['registration'],
SyncManager: ['registration.sync'],
PeriodicSyncManager: ['registration.periodicSync'],
SyncEvent: ['new SyncEvent("tag", "lastChance")'],
PeriodicSyncEvent: ['new PeriodicSyncEvent("tag")'],
});
}
}
);


@ -0,0 +1,114 @@
<!DOCTYPE html>
<title>IDBObjectStore.delete() and IDBCursor.continue() - object store - remove a record from the object store while iterating cursor</title>
<link rel="author" title="Mozilla" href="https://www.mozilla.org">
<script src="/resources/testharness.js"></script>
<script src="/resources/testharnessreport.js"></script>
<script src="support.js"></script>
<script>
/* The goal here is to test that any prefetching of cursor values performs
* correct invalidation of prefetched data. This test is motivated by the
* particularities of the Firefox implementation of preloading, and is
* specifically motivated by an edge case when prefetching prefetches at
* least 2 extra records and at most determines whether a mutation is
* potentially relevant based on current cursor position and direction and
* does not test for key equivalence. Future implementations may want to
* help refine this test if their cursors are more clever.
*
* Step-wise we:
* - Open a cursor, returning key 0.
* - When the cursor request completes, without yielding control:
* - Issue a delete() call that won't actually delete anything but looks
* relevant. This should purge prefetched records 1 and 2.
* - Issue a continue() which should result in record 1 being fetched
* again and record 2 being prefetched again.
* - Delete record 2. Unless there's a synchronously available source
* of truth, the data from continue() above will not be present and
* we'll expect the implementation to need to set a flag to invalidate
* the prefetched data when it arrives.
* - When the cursor request completes, validate we got record 1 and issue
* a continue.
* - When the request completes, we should have a null cursor result value
* because 2 was deleted.
*/
var db,
count = 0,
t = async_test(),
records = [ { pKey: "primaryKey_0" },
{ pKey: "primaryKey_1" },
{ pKey: "primaryKey_2" } ];
// This is a key that is not present in the database, but that is known to
// be relevant to a forward iteration of the above keys by comparing to be
// greater than all of them.
var plausibleFutureKey = "primaryKey_9";
var open_rq = createdb(t);
open_rq.onupgradeneeded = function(e) {
db = e.target.result;
var objStore = db.createObjectStore("test", { keyPath: "pKey" });
for (var i = 0; i < records.length; i++)
objStore.add(records[i]);
};
open_rq.onsuccess = t.step_func(CursorDeleteRecord);
function CursorDeleteRecord(e) {
var txn = db.transaction("test", "readwrite"),
object_store = txn.objectStore("test"),
cursor_rq = object_store.openCursor();
var iteration = 0;
cursor_rq.onsuccess = t.step_func(function(e) {
var cursor = e.target.result;
switch (iteration) {
case 0:
object_store.delete(plausibleFutureKey);
assert_true(cursor != null, "cursor valid");
assert_equals(cursor.value.pKey, records[iteration].pKey);
cursor.continue();
object_store.delete(records[2].pKey);
break;
case 1:
assert_true(cursor != null, "cursor valid");
assert_equals(cursor.value.pKey, records[iteration].pKey);
cursor.continue();
break;
case 2:
assert_equals(cursor, null, "cursor no longer valid");
break;
};
iteration++;
});
txn.oncomplete = t.step_func(VerifyRecordWasDeleted);
}
function VerifyRecordWasDeleted(e) {
var cursor_rq = db.transaction("test")
.objectStore("test")
.openCursor();
cursor_rq.onsuccess = t.step_func(function(e) {
var cursor = e.target.result;
if (!cursor) {
assert_equals(count, 2, 'count');
t.done();
return;
}
assert_equals(cursor.value.pKey, records[count].pKey);
count++;
cursor.continue();
});
}
</script>
<div id="log"></div>


@ -48,128 +48,9 @@ i.e. use `git pull --prune` (or `git fetch -p && git merge`).
Running the Tests
=================
The tests are designed to be run from your local computer. The test
environment requires [Python 2.7+](http://www.python.org/downloads) (but not Python 3.x).
On Windows, be sure to add the Python directory (`c:\python2x`, by default) to
your `%Path%` [Environment Variable](http://www.computerhope.com/issues/ch000549.htm),
and read the [Windows Notes](#windows-notes) section below.
To get the tests running, you need to set up the test domains in your
[`hosts` file](http://en.wikipedia.org/wiki/Hosts_%28file%29%23Location_in_the_file_system).
The necessary content can be generated with `./wpt make-hosts-file`; on
Windows, you will need to precede the prior command with `python` or
the path to the Python binary (`python wpt make-hosts-file`).
For example, on most UNIX-like systems, you can setup the hosts file with:
```bash
./wpt make-hosts-file | sudo tee -a /etc/hosts
```
And on Windows (this must be run in a PowerShell session with Administrator privileges):
```powershell
python wpt make-hosts-file | Out-File $env:systemroot\System32\drivers\etc\hosts -Encoding ascii -Append
```
If you are behind a proxy, you also need to make sure the domains above are
excluded from your proxy lookups.
Running Tests Manually
======================
The test server can be started using
```
./wpt serve
```
**On Windows**: You will need to precede the prior command with
`python` or the path to the python binary.
```bash
python wpt serve
```
This will start HTTP servers on two ports and a websockets server on
one port. By default the web servers start on ports 8000 and 8443 and
the other ports are randomly-chosen free ports. Tests must be loaded
from the *first* HTTP server in the output. To change the ports,
create a `config.json` file in the wpt root directory, and add
port definitions of your choice e.g.:
```
{
"ports": {
"http": [1234, "auto"],
"https":[5678]
}
}
```
After your `hosts` file is configured, the servers will be locally accessible at:
http://web-platform.test:8000/<br>
https://web-platform.test:8443/ *
To use the web-based runner point your browser to:
http://web-platform.test:8000/tools/runner/index.html <br>
https://web-platform.test:8443/tools/runner/index.html *
\**See [Trusting Root CA](./tools/certs/README.md)*
Running Tests Automatically
---------------------------
Tests can be run automatically in a browser using the `run` command of
the `wpt` script in the root of the checkout. This requires the hosts
file setup documented above, but you must *not* have the
test server already running when calling `wpt run`. The basic command
line syntax is:
```bash
./wpt run product [tests]
```
**On Windows**: You will need to precede the prior command with
`python` or the path to the python binary.
```bash
python wpt run product [tests]
```
where `product` is currently `firefox` or `chrome` and `[tests]` is a
list of paths to tests. This will attempt to automatically locate a
browser instance and install required dependencies. The command is
very configurable; for example to specify a particular binary use
`wpt run --binary=path product`. The full range of options can be seen
with `wpt run --help` and `wpt run --wptrunner-help`.
Not all dependencies can be automatically installed; in particular the
`certutil` tool required to run https tests with Firefox must be
installed using a system package manager or similar.
On Debian/Ubuntu certutil may be installed using:
```
sudo apt install libnss3-tools
```
And on macOS with homebrew using:
```
brew install nss
```
On other platforms, download the firefox archive and common.tests.tar.gz
archive for your platform from
[Mozilla CI](https://archive.mozilla.org/pub/firefox/nightly/latest-mozilla-central/).
Then extract `certutil[.exe]` from the tests.tar.gz package and
`libnss3[.so|.dll|.dylib]` and put the former on your path and the latter on
your library path.
See the [documentation website](https://web-platform-tests.org/running-tests/)
and in particular the
[system setup for running tests locally](https://web-platform-tests.org/running-tests/from-local-system.html#system-setup).
Command Line Tools
==================


@ -6,12 +6,6 @@ portable annotations. The tools in this directory, along with the
sample files supplied, can be used to exercise the vocabulary
"implementation" against various RDF processing engines.
ruby-rdf
========
This directory contains a Ruby script that will evaluate the samples. See
the README.md file in that directory for more information.
vocab-tester.py
===============


@ -1,12 +0,0 @@
source "https://rubygems.org"
ruby '2.3.1'
gem 'rdf', github: "ruby-rdf/rdf", branch: 'develop'
gem 'rdf-rdfxml', github: "ruby-rdf/rdf-rdfxml", branch: 'develop'
gem 'linkeddata'
gem 'jsonlint'
gem 'rspec'
gem 'rake'
gem 'byebug'


@ -1,192 +0,0 @@
GIT
remote: git://github.com/ruby-rdf/rdf-rdfxml.git
revision: 8a12a78aa28f3a0f58926ae77844d3e7af52ea4c
branch: develop
specs:
rdf-rdfxml (2.0.0)
htmlentities (~> 4.3)
rdf (~> 2.0)
rdf-rdfa (~> 2.0)
rdf-xsd (~> 2.0)
GIT
remote: git://github.com/ruby-rdf/rdf.git
revision: 4740b4a52bf358656d01d93adc5174d5fe07aec8
branch: develop
specs:
rdf (2.1.0)
hamster (~> 3.0)
link_header (~> 0.0, >= 0.0.8)
GEM
remote: https://rubygems.org/
specs:
addressable (2.5.0)
public_suffix (~> 2.0, >= 2.0.2)
bcp47 (0.3.3)
i18n
builder (3.2.2)
byebug (9.0.6)
concurrent-ruby (1.0.2)
crack (0.4.3)
safe_yaml (~> 1.0.0)
diff-lcs (1.2.5)
ebnf (1.0.1)
rdf (~> 2.0)
sxp (~> 1.0)
equivalent-xml (0.6.0)
nokogiri (>= 1.8.2)
haml (4.0.7)
tilt
hamster (3.0.0)
concurrent-ruby (~> 1.0)
hashdiff (0.3.0)
htmlentities (4.3.4)
i18n (0.7.0)
json-ld (2.1.0)
multi_json (~> 1.11)
rdf (~> 2.1)
jsonlint (0.2.0)
oj (~> 2)
trollop (~> 2)
ld-patch (0.3.0)
ebnf (~> 1.0, >= 1.0.1)
rdf (~> 2.0)
rdf-xsd (~> 2.0)
sparql (~> 2.0)
sxp (~> 1.0)
link_header (0.0.8)
linkeddata (2.0.0)
equivalent-xml (~> 0.6)
json-ld (~> 2.0)
ld-patch (~> 0.3)
nokogiri (~> 1.8.2)
rdf (~> 2.0)
rdf-aggregate-repo (~> 2.0)
rdf-isomorphic (~> 2.0)
rdf-json (~> 2.0)
rdf-microdata (~> 2.0)
rdf-n3 (~> 2.0)
rdf-rdfa (~> 2.0)
rdf-rdfxml (~> 2.0)
rdf-reasoner (~> 0.4)
rdf-tabular (~> 0.4)
rdf-trig (~> 2.0)
rdf-trix (~> 2.0)
rdf-turtle (~> 2.0)
rdf-vocab (~> 2.0)
rdf-xsd (~> 2.0)
sparql (~> 2.0)
sparql-client (~> 2.0)
mini_portile2 (2.1.0)
multi_json (1.12.1)
net-http-persistent (2.9.4)
nokogiri (~> 1.8.2)
mini_portile2 (~> 2.1.0)
oj (2.17.5)
public_suffix (2.0.4)
rake (11.3.0)
rdf-aggregate-repo (2.0.0)
rdf (~> 2.0)
rdf-isomorphic (2.0.0)
rdf (~> 2.0)
rdf-json (2.0.0)
rdf (~> 2.0)
rdf-microdata (2.0.3)
htmlentities (~> 4.3)
nokogiri (~> 1.8.2)
rdf (~> 2.0)
rdf-xsd (~> 2.0)
rdf-n3 (2.0.0)
rdf (~> 2.0)
rdf-rdfa (2.0.1)
haml (~> 4.0)
htmlentities (~> 4.3)
rdf (~> 2.0)
rdf-aggregate-repo (~> 2.0)
rdf-xsd (~> 2.0)
rdf-reasoner (0.4.0)
rdf (~> 2.0)
rdf-spec (~> 2.0)
rdf-vocab (~> 2.0)
rdf-xsd (~> 2.0)
rdf-spec (2.0.0)
rdf (~> 2.0)
rdf-isomorphic (~> 2.0)
rspec (~> 3.0)
rspec-its (~> 1.0)
webmock (~> 1.17)
rdf-tabular (0.4.0)
addressable (~> 2.3)
bcp47 (~> 0.3, >= 0.3.3)
json-ld (~> 2.0)
rdf (~> 2.0)
rdf-vocab (~> 2.0)
rdf-xsd (~> 2.0)
rdf-trig (2.0.0)
ebnf (~> 1.0, >= 1.0.1)
rdf (~> 2.0)
rdf-turtle (~> 2.0)
rdf-trix (2.0.0)
rdf (~> 2.0)
rdf-turtle (2.0.0)
ebnf (~> 1.0, >= 1.0.1)
rdf (~> 2.0)
rdf-vocab (2.1.0)
rdf (~> 2.1)
rdf-xsd (2.0.0)
rdf (~> 2.0)
rspec (3.5.0)
rspec-core (~> 3.5.0)
rspec-expectations (~> 3.5.0)
rspec-mocks (~> 3.5.0)
rspec-core (3.5.4)
rspec-support (~> 3.5.0)
rspec-expectations (3.5.0)
diff-lcs (>= 1.2.0, < 2.0)
rspec-support (~> 3.5.0)
rspec-its (1.2.0)
rspec-core (>= 3.0.0)
rspec-expectations (>= 3.0.0)
rspec-mocks (3.5.0)
diff-lcs (>= 1.2.0, < 2.0)
rspec-support (~> 3.5.0)
rspec-support (3.5.0)
safe_yaml (1.0.4)
sparql (2.0.0)
builder (~> 3.2)
ebnf (~> 1.0, >= 1.0.1)
rdf (~> 2.0)
rdf-aggregate-repo (~> 2.0)
rdf-xsd (~> 2.0)
sparql-client (~> 2.0)
sxp (~> 1.0)
sparql-client (2.1.0)
net-http-persistent (~> 2.9)
rdf (~> 2.0)
sxp (1.0.0)
rdf (~> 2.0)
tilt (2.0.5)
trollop (2.1.2)
webmock (1.24.6)
addressable (>= 2.3.6)
crack (>= 0.3.2)
hashdiff
PLATFORMS
ruby
DEPENDENCIES
byebug
jsonlint
linkeddata
rake
rdf!
rdf-rdfxml!
rspec
RUBY VERSION
ruby 2.3.1p112
BUNDLED WITH
1.12.5


@ -1,5 +0,0 @@
## Annotation Vocabulary test in Ruby
Set up by installing Ruby 2.3.1 and running `bundle install`. Then, to run the tests: `bundle exec rake`.
To get formatted output, run `bundle exec rspec annotation-vocab_spec.rb -f h -o RESULTS.md`


@ -1,11 +0,0 @@
require "bundler/setup"
begin
require 'rspec/core/rake_task'
RSpec::Core::RakeTask.new(:spec) do |t|
t.pattern = "annotation-vocab_spec.rb"
end
task :default => :spec
rescue LoadError
end


@ -1,120 +0,0 @@
require "bundler/setup"
require 'rdf'
require 'linkeddata'
require 'rdf/spec/matchers'
require 'rspec'
SAMPLES = File.expand_path("../../samples/", __FILE__)
CORRECT = Dir.glob(File.join(SAMPLES, "correct/*.json"))
INCORRECT = Dir.glob(File.join(SAMPLES, "incorrect/*.json"))
RDF::Reasoner.apply(:rdfs, :owl)
VOCAB_URI = "http://www.w3.org/ns/oa#"
VOCAB_GRAPH = begin
g = RDF::Graph.load(VOCAB_URI, format: :jsonld, headers: {"Accept" => "application/ld+json"})
g.each_object {|o| o.squish! if o.literal?}
g
end
describe "Web Annotation Vocab" do
let(:vocab) {VOCAB_URI}
let(:vocab_graph) {VOCAB_GRAPH}
# Load the Annotation vocabulary first, so that the version defined in rdf-vocab is not lazy-loaded
before(:all) do
RDF::Vocabulary.from_graph(VOCAB_GRAPH, url: VOCAB_URI, class_name: RDF::Vocab::OA)
end
it "The JSON-LD context document can be parsed without errors by JSON-LD validators" do
expect {JSON::LD::API.expand(vocab, validate: true)}.not_to raise_error
end
context "The JSON-LD context document can be used to convert JSON-LD serialized Annotations into RDF triples" do
CORRECT.each do |file|
it "#{file.split('/').last}" do
nt = file.sub(/.json$/, '.nt')
gjld = RDF::Graph.load(file, format: :jsonld)
gnt = RDF::Graph.load(nt, format: :ntriples)
expect(gjld).to be_equivalent_graph(gnt)
end
it "lint #{file.split('/').last}" do
gjld = RDF::Graph.load(file, format: :jsonld)
gjld.entail!
expect(gjld.lint).to be_empty
end
end
end
context "detects errors in incorrect examples" do
INCORRECT.each do |file|
it "#{file.split('/').last}" do
pending "Empty Documents are invalid" if file =~ /anno2.json|anno3.json/
expect {RDF::Graph.load(file, validate: true, format: :jsonld, logger: false)}.to raise_error(RDF::ReaderError)
end
end
end
context "The ontology documents can be parsed without errors by RDF Schema validators" do
{
jsonld: "application/ld+json",
rdfxml: "application/rdf+xml",
ttl: "text/turtle",
}.each do |format, content_type|
it "JSON-LD version is isomorphic to #{format}" do
expect do
RDF::Graph.load(vocab, format: format, validate: true, headers: {"Accept" => content_type})
end.not_to raise_error
end
end
end
context "The ontology documents are isomorphic to each other" do
{
rdfxml: "application/rdf+xml",
ttl: "text/turtle",
}.each do |format, content_type|
it format do
fg = RDF::Graph.load(vocab, format: format, headers: {"Accept" => content_type})
# XXX Normalize whitespace in literals to ease comparison
fg.each_object {|o| o.squish! if o.literal?}
expect(fg).to be_equivalent_graph(vocab_graph)
end
end
end
context "The ontology is internally consistent with respect to domains, ranges, inverses, and any other ontology features specified." do
it "lints cleanly" do
entailed_graph = vocab_graph.dup.entail!
expect(entailed_graph.lint).to be_empty
end
RDF::Vocab::OA.each do |term|
if term.type.to_s =~ /Class/
context term.pname do
it "subClassOf" do
expect {term.subClassOf.map(&:pname)}.not_to raise_error
end
it "equivalentClass" do
expect {term.equivalentClass.map(&:pname)}.not_to raise_error
end
end
elsif term.type.to_s =~ /Property/
context term.pname do
it "subPropertyOf" do
expect {term.subPropertyOf.map(&:pname)}.not_to raise_error
end
it "domain" do
expect {term.domain.map(&:pname)}.not_to raise_error
end
it "range" do
expect {term.range.map(&:pname)}.not_to raise_error
end
it "equivalentProperty" do
expect {term.equivalentProperty.map(&:pname)}.not_to raise_error
end
end
end
end
end
end


@ -1,7 +0,0 @@
{
"@context": "http://www.w3.org/ns/anno.jsonld",
"id": "http://example.org/anno1",
"type": "Annotation",
"body": "http://example.org/post1",
"target": "http://example.com/page1"
}


@ -1,3 +0,0 @@
<http://example.org/anno1> <http://www.w3.org/ns/oa#hasTarget> <http://example.com/page1> .
<http://example.org/anno1> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#Annotation> .
<http://example.org/anno1> <http://www.w3.org/ns/oa#hasBody> <http://example.org/post1> .


@ -1,19 +0,0 @@
{
"@context": "http://www.w3.org/ns/anno.jsonld",
"id": "http://example.org/anno10",
"type": "Annotation",
"body": {
"type": "Choice",
"items": [
{
"id": "http://example.org/note1",
"language": "en"
},
{
"id": "http://example.org/note2",
"language": "fr"
}
]
},
"target": "http://example.org/website1"
}


@ -1,11 +0,0 @@
_:b2 <http://www.w3.org/1999/02/22-rdf-syntax-ns#first> <http://example.org/note2> .
_:b2 <http://www.w3.org/1999/02/22-rdf-syntax-ns#rest> <http://www.w3.org/1999/02/22-rdf-syntax-ns#nil> .
_:b1 <http://www.w3.org/1999/02/22-rdf-syntax-ns#first> <http://example.org/note1> .
_:b1 <http://www.w3.org/1999/02/22-rdf-syntax-ns#rest> _:b2 .
<http://example.org/note2> <http://purl.org/dc/elements/1.1/language> "fr" .
_:b0 <http://www.w3.org/ns/activitystreams#items> _:b1 .
_:b0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#Choice> .
<http://example.org/anno10> <http://www.w3.org/ns/oa#hasTarget> <http://example.org/website1> .
<http://example.org/anno10> <http://www.w3.org/ns/oa#hasBody> _:b0 .
<http://example.org/anno10> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#Annotation> .
<http://example.org/note1> <http://purl.org/dc/elements/1.1/language> "en" .


@ -1,18 +0,0 @@
{
"@context": "http://www.w3.org/ns/anno.jsonld",
"id": "http://example.org/anno11",
"type": "Annotation",
"motivation": "commenting",
"body": {
"type": "TextualBody",
"value": "These pages together provide evidence of the conspiracy"
},
"target": {
"type": "Composite",
"items": [
"http://example.com/page1",
"http://example.org/page6",
"http://example.net/page4"
]
}
}

View file

@ -1,14 +0,0 @@
_:b2 <http://www.w3.org/1999/02/22-rdf-syntax-ns#rest> _:b3 .
_:b2 <http://www.w3.org/1999/02/22-rdf-syntax-ns#first> <http://example.com/page1> .
<http://example.org/anno11> <http://www.w3.org/ns/oa#hasBody> _:b0 .
<http://example.org/anno11> <http://www.w3.org/ns/oa#hasTarget> _:b1 .
<http://example.org/anno11> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#Annotation> .
<http://example.org/anno11> <http://www.w3.org/ns/oa#motivatedBy> <http://www.w3.org/ns/oa#commenting> .
_:b4 <http://www.w3.org/1999/02/22-rdf-syntax-ns#rest> <http://www.w3.org/1999/02/22-rdf-syntax-ns#nil> .
_:b4 <http://www.w3.org/1999/02/22-rdf-syntax-ns#first> <http://example.net/page4> .
_:b1 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#Composite> .
_:b1 <http://www.w3.org/ns/activitystreams#items> _:b2 .
_:b3 <http://www.w3.org/1999/02/22-rdf-syntax-ns#rest> _:b4 .
_:b3 <http://www.w3.org/1999/02/22-rdf-syntax-ns#first> <http://example.org/page6> .
_:b0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#value> "These pages together provide evidence of the conspiracy" .
_:b0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#TextualBody> .

View file

@ -1,19 +0,0 @@
{
"@context": "http://www.w3.org/ns/anno.jsonld",
"id": "http://example.org/anno12",
"type": "Annotation",
"motivation": "tagging",
"body": {
"type": "TextualBody",
"value": "important"
},
"target": {
"type": "List",
"items": [
"http://example.com/book/page1",
"http://example.com/book/page2",
"http://example.com/book/page3",
"http://example.com/book/page4"
]
}
}

View file

@ -1,16 +0,0 @@
_:b0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#value> "important" .
_:b0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#TextualBody> .
_:b5 <http://www.w3.org/1999/02/22-rdf-syntax-ns#first> <http://example.com/book/page4> .
_:b5 <http://www.w3.org/1999/02/22-rdf-syntax-ns#rest> <http://www.w3.org/1999/02/22-rdf-syntax-ns#nil> .
<http://example.org/anno12> <http://www.w3.org/ns/oa#hasBody> _:b0 .
<http://example.org/anno12> <http://www.w3.org/ns/oa#motivatedBy> <http://www.w3.org/ns/oa#tagging> .
<http://example.org/anno12> <http://www.w3.org/ns/oa#hasTarget> _:b1 .
<http://example.org/anno12> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#Annotation> .
_:b2 <http://www.w3.org/1999/02/22-rdf-syntax-ns#first> <http://example.com/book/page1> .
_:b2 <http://www.w3.org/1999/02/22-rdf-syntax-ns#rest> _:b3 .
_:b3 <http://www.w3.org/1999/02/22-rdf-syntax-ns#first> <http://example.com/book/page2> .
_:b3 <http://www.w3.org/1999/02/22-rdf-syntax-ns#rest> _:b4 .
_:b1 <http://www.w3.org/ns/activitystreams#items> _:b2 .
_:b1 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#List> .
_:b4 <http://www.w3.org/1999/02/22-rdf-syntax-ns#first> <http://example.com/book/page3> .
_:b4 <http://www.w3.org/1999/02/22-rdf-syntax-ns#rest> _:b5 .

View file

@ -1,16 +0,0 @@
{
"@context": "http://www.w3.org/ns/anno.jsonld",
"id": "http://example.org/anno13",
"type": "Annotation",
"motivation": "classifying",
"body": "http://example.org/vocab/art/portrait",
"target": {
"type": "Independents",
"items": [
"http://example.com/image1",
"http://example.net/image2",
"http://example.com/image4",
"http://example.org/image9"
]
}
}

View file

@ -1,14 +0,0 @@
_:b0 <http://www.w3.org/ns/activitystreams#items> _:b1 .
_:b0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#Independents> .
_:b1 <http://www.w3.org/1999/02/22-rdf-syntax-ns#rest> _:b2 .
_:b1 <http://www.w3.org/1999/02/22-rdf-syntax-ns#first> <http://example.com/image1> .
<http://example.org/anno13> <http://www.w3.org/ns/oa#motivatedBy> <http://www.w3.org/ns/oa#classifying> .
<http://example.org/anno13> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#Annotation> .
<http://example.org/anno13> <http://www.w3.org/ns/oa#hasTarget> _:b0 .
<http://example.org/anno13> <http://www.w3.org/ns/oa#hasBody> <http://example.org/vocab/art/portrait> .
_:b3 <http://www.w3.org/1999/02/22-rdf-syntax-ns#rest> _:b4 .
_:b3 <http://www.w3.org/1999/02/22-rdf-syntax-ns#first> <http://example.com/image4> .
_:b2 <http://www.w3.org/1999/02/22-rdf-syntax-ns#rest> _:b3 .
_:b2 <http://www.w3.org/1999/02/22-rdf-syntax-ns#first> <http://example.net/image2> .
_:b4 <http://www.w3.org/1999/02/22-rdf-syntax-ns#rest> <http://www.w3.org/1999/02/22-rdf-syntax-ns#nil> .
_:b4 <http://www.w3.org/1999/02/22-rdf-syntax-ns#first> <http://example.org/image9> .

View file

@ -1,16 +0,0 @@
{
"@context": "http://www.w3.org/ns/anno.jsonld",
"id": "http://example.org/anno14",
"type": "Annotation",
"creator": "http://example.org/user1",
"created": "2015-01-28T12:00:00Z",
"modified": "2015-01-29T09:00:00Z",
"generator": "http://example.org/client1",
"generated": "2015-02-04T12:00:00Z",
"body": {
"id": "http://example.net/review1",
"creator": "http://example.net/user2",
"created": "2014-06-02T17:00:00Z"
},
"target": "http://example.com/restaurant1"
}

View file

@ -1,10 +0,0 @@
<http://example.org/anno14> <http://www.w3.org/ns/oa#hasBody> <http://example.net/review1> .
<http://example.org/anno14> <http://purl.org/dc/terms/modified> "2015-01-29T09:00:00Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> .
<http://example.org/anno14> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#Annotation> .
<http://example.org/anno14> <http://www.w3.org/ns/activitystreams#generator> <http://example.org/client1> .
<http://example.org/anno14> <http://www.w3.org/ns/oa#hasTarget> <http://example.com/restaurant1> .
<http://example.org/anno14> <http://purl.org/dc/terms/created> "2015-01-28T12:00:00Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> .
<http://example.org/anno14> <http://purl.org/dc/terms/issued> "2015-02-04T12:00:00Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> .
<http://example.org/anno14> <http://purl.org/dc/terms/creator> <http://example.org/user1> .
<http://example.net/review1> <http://purl.org/dc/terms/creator> <http://example.net/user2> .
<http://example.net/review1> <http://purl.org/dc/terms/created> "2014-06-02T17:00:00Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> .

View file

@ -1,20 +0,0 @@
{
"@context": "http://www.w3.org/ns/anno.jsonld",
"id": "http://example.org/anno15",
"type": "Annotation",
"creator": {
"id": "http://example.org/user1",
"type": "Person",
"name": "My Pseudonym",
"nickname": "pseudo",
"email_sha1": "58bad08927902ff9307b621c54716dcc5083e339"
},
"generator": {
"id": "http://example.org/client1",
"type": "Software",
"name": "Code v2.1",
"homepage": "http://example.org/client1/homepage1"
},
"body": "http://example.net/review1",
"target": "http://example.com/restaurant1"
}

View file

@ -1,12 +0,0 @@
<http://example.org/user1> <http://xmlns.com/foaf/0.1/nick> "pseudo" .
<http://example.org/user1> <http://xmlns.com/foaf/0.1/mbox_sha1sum> "58bad08927902ff9307b621c54716dcc5083e339" .
<http://example.org/user1> <http://xmlns.com/foaf/0.1/name> "My Pseudonym" .
<http://example.org/user1> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://xmlns.com/foaf/0.1/Person> .
<http://example.org/anno15> <http://purl.org/dc/terms/creator> <http://example.org/user1> .
<http://example.org/anno15> <http://www.w3.org/ns/oa#hasTarget> <http://example.com/restaurant1> .
<http://example.org/anno15> <http://www.w3.org/ns/oa#hasBody> <http://example.net/review1> .
<http://example.org/anno15> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#Annotation> .
<http://example.org/anno15> <http://www.w3.org/ns/activitystreams#generator> <http://example.org/client1> .
<http://example.org/client1> <http://xmlns.com/foaf/0.1/name> "Code v2.1" .
<http://example.org/client1> <http://xmlns.com/foaf/0.1/homepage> <http://example.org/client1/homepage1> .
<http://example.org/client1> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/activitystreams#Application> .

View file

@ -1,12 +0,0 @@
{
"@context": "http://www.w3.org/ns/anno.jsonld",
"id": "http://example.org/anno16",
"type": "Annotation",
"audience": {
"id": "http://example.edu/roles/teacher",
"type": "schema:EducationalAudience",
"schema:educationalRole": "teacher"
},
"body": "http://example.net/classnotes1",
"target": "http://example.com/textbook1"
}

View file

@ -1,6 +0,0 @@
<http://example.edu/roles/teacher> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://schema.org/EducationalAudience> .
<http://example.edu/roles/teacher> <http://schema.org/educationalRole> "teacher" .
<http://example.org/anno16> <http://www.w3.org/ns/oa#hasBody> <http://example.net/classnotes1> .
<http://example.org/anno16> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#Annotation> .
<http://example.org/anno16> <http://schema.org/audience> <http://example.edu/roles/teacher> .
<http://example.org/anno16> <http://www.w3.org/ns/oa#hasTarget> <http://example.com/textbook1> .

View file

@ -1,12 +0,0 @@
{
"@context": "http://www.w3.org/ns/anno.jsonld",
"id": "http://example.org/anno17",
"type": "Annotation",
"motivation": "commenting",
"body": "http://example.net/comment1",
"target": {
"id": "http://example.com/video1",
"type": "Video",
"accessibility": "captions"
}
}

View file

@ -1,6 +0,0 @@
<http://example.org/anno17> <http://www.w3.org/ns/oa#hasBody> <http://example.net/comment1> .
<http://example.org/anno17> <http://www.w3.org/ns/oa#hasTarget> <http://example.com/video1> .
<http://example.org/anno17> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#Annotation> .
<http://example.org/anno17> <http://www.w3.org/ns/oa#motivatedBy> <http://www.w3.org/ns/oa#commenting> .
<http://example.com/video1> <http://schema.org/accessibilityFeature> "captions" .
<http://example.com/video1> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://purl.org/dc/dcmitype/MovingImage> .

View file

@ -1,19 +0,0 @@
{
"@context": "http://www.w3.org/ns/anno.jsonld",
"id": "http://example.org/anno18",
"type": "Annotation",
"motivation": "bookmarking",
"body": [
{
"type": "TextualBody",
"value": "readme",
"purpose": "tagging"
},
{
"type": "TextualBody",
"value": "A good description of the topic that bears further investigation",
"purpose": "describing"
}
],
"target": "http://example.com/page1"
}

View file

@ -1,11 +0,0 @@
<http://example.org/anno18> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#Annotation> .
<http://example.org/anno18> <http://www.w3.org/ns/oa#hasTarget> <http://example.com/page1> .
<http://example.org/anno18> <http://www.w3.org/ns/oa#motivatedBy> <http://www.w3.org/ns/oa#bookmarking> .
<http://example.org/anno18> <http://www.w3.org/ns/oa#hasBody> _:b1 .
<http://example.org/anno18> <http://www.w3.org/ns/oa#hasBody> _:b0 .
_:b1 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#TextualBody> .
_:b1 <http://www.w3.org/1999/02/22-rdf-syntax-ns#value> "A good description of the topic that bears further investigation" .
_:b1 <http://www.w3.org/ns/oa#hasPurpose> <http://www.w3.org/ns/oa#describing> .
_:b0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#TextualBody> .
_:b0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#value> "readme" .
_:b0 <http://www.w3.org/ns/oa#hasPurpose> <http://www.w3.org/ns/oa#tagging> .

View file

@ -1,11 +0,0 @@
{
"@context": "http://www.w3.org/ns/anno.jsonld",
"id": "http://example.org/anno19",
"type": "Annotation",
"rights": "https://creativecommons.org/publicdomain/zero/1.0/",
"body": {
"id": "http://example.net/review1",
"rights": "http://creativecommons.org/licenses/by-nc/4.0/"
},
"target": "http://example.com/product1"
}

View file

@ -1,5 +0,0 @@
<http://example.net/review1> <http://purl.org/dc/terms/rights> <http://creativecommons.org/licenses/by-nc/4.0/> .
<http://example.org/anno19> <http://purl.org/dc/terms/rights> <https://creativecommons.org/publicdomain/zero/1.0/> .
<http://example.org/anno19> <http://www.w3.org/ns/oa#hasBody> <http://example.net/review1> .
<http://example.org/anno19> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <http://www.w3.org/ns/oa#Annotation> .
<http://example.org/anno19> <http://www.w3.org/ns/oa#hasTarget> <http://example.com/product1> .

View file

@ -1,17 +0,0 @@
{
"@context": "http://www.w3.org/ns/anno.jsonld",
"id": "http://example.org/anno2",
"type": "Annotation",
"body": {
"id": "http://example.org/analysis1.mp3",
"format": "audio/mpeg",
"language": "fr"
},
"target": {
"id": "http://example.gov/patent1.pdf",
"format": "application/pdf",
"language": ["en", "ar"],
"textDirection": "ltr",
"processingLanguage": "en"
}
}

Some files were not shown because too many files have changed in this diff.