The full dataset viewer is not available. Only showing a preview of the rows.
The dataset generation failed because of a cast error
Error code:   DatasetGenerationCastError
Exception:    DatasetGenerationCastError
Message:      An error occurred while generating the dataset

All the data files must have the same columns, but at some point there are 4 new columns ({'name', 'parent', 'metadata', 'id'}) and 6 missing columns ({'source_file', 'module_name', 'methods_count', 'events_count', 'properties_count', 'content'}).

This happened while the json dataset builder was generating data using

hf://datasets/Aptlantis/nodejs-all.json/nodejs_granular.jsonl (at revision 1247cc36a89cc104a30319f693b01616953e6e4d), [/tmp/hf-datasets-cache/medium/datasets/80555121944970-config-parquet-and-info-Aptlantis-nodejs-all-json-43603334/hub/datasets--Aptlantis--nodejs-all.json/snapshots/1247cc36a89cc104a30319f693b01616953e6e4d/nodejs_documents.jsonl (origin=hf://datasets/Aptlantis/nodejs-all.json@1247cc36a89cc104a30319f693b01616953e6e4d/nodejs_documents.jsonl), /tmp/hf-datasets-cache/medium/datasets/80555121944970-config-parquet-and-info-Aptlantis-nodejs-all-json-43603334/hub/datasets--Aptlantis--nodejs-all.json/snapshots/1247cc36a89cc104a30319f693b01616953e6e4d/nodejs_granular.jsonl (origin=hf://datasets/Aptlantis/nodejs-all.json@1247cc36a89cc104a30319f693b01616953e6e4d/nodejs_granular.jsonl)]

Please either edit the data files to have matching columns, or separate them into different configurations (see docs at https://hf.co/docs/hub/datasets-manual-configuration#multiple-configurations)
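Since the two files intentionally carry different schemas, the second option is the usual fix: declare one configuration per file in the dataset card's YAML front matter, following the manual-configuration docs linked above. A sketch (the config names `documents` and `granular` are made up; the file names come from the error message):

```yaml
configs:
  - config_name: documents
    data_files: "nodejs_documents.jsonl"
  - config_name: granular
    data_files: "nodejs_granular.jsonl"
```

With this in place, each file is built and cast independently, so the viewer never tries to fit both schemas into one Arrow table.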
Traceback:    Traceback (most recent call last):
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1887, in _prepare_split_single
                  writer.write_table(table)
                File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 674, in write_table
                  pa_table = table_cast(pa_table, self._schema)
                             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/usr/local/lib/python3.12/site-packages/datasets/table.py", line 2272, in table_cast
                  return cast_table_to_schema(table, schema)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/usr/local/lib/python3.12/site-packages/datasets/table.py", line 2218, in cast_table_to_schema
                  raise CastError(
              datasets.table.CastError: Couldn't cast
              id: string
              type: string
              parent: string
              name: string
              description: string
              metadata: string
              to
              {'source_file': Value('string'), 'module_name': Value('string'), 'type': Value('string'), 'description': Value('string'), 'methods_count': Value('int64'), 'properties_count': Value('int64'), 'events_count': Value('int64'), 'content': Value('string')}
              because column names don't match
              
              During handling of the above exception, another exception occurred:
              
              Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1347, in compute_config_parquet_and_info_response
                  parquet_operations = convert_to_parquet(builder)
                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 980, in convert_to_parquet
                  builder.download_and_prepare(
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 884, in download_and_prepare
                  self._download_and_prepare(
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 947, in _download_and_prepare
                  self._prepare_split(split_generator, **prepare_split_kwargs)
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1736, in _prepare_split
                  for job_id, done, content in self._prepare_split_single(
                                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
                File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1889, in _prepare_split_single
                  raise DatasetGenerationCastError.from_cast_error(
              datasets.exceptions.DatasetGenerationCastError: An error occurred while generating the dataset
              
              All the data files must have the same columns, but at some point there are 4 new columns ({'name', 'parent', 'metadata', 'id'}) and 6 missing columns ({'source_file', 'module_name', 'methods_count', 'events_count', 'properties_count', 'content'}).
              
              This happened while the json dataset builder was generating data using
              
              hf://datasets/Aptlantis/nodejs-all.json/nodejs_granular.jsonl (at revision 1247cc36a89cc104a30319f693b01616953e6e4d), [/tmp/hf-datasets-cache/medium/datasets/80555121944970-config-parquet-and-info-Aptlantis-nodejs-all-json-43603334/hub/datasets--Aptlantis--nodejs-all.json/snapshots/1247cc36a89cc104a30319f693b01616953e6e4d/nodejs_documents.jsonl (origin=hf://datasets/Aptlantis/nodejs-all.json@1247cc36a89cc104a30319f693b01616953e6e4d/nodejs_documents.jsonl), /tmp/hf-datasets-cache/medium/datasets/80555121944970-config-parquet-and-info-Aptlantis-nodejs-all-json-43603334/hub/datasets--Aptlantis--nodejs-all.json/snapshots/1247cc36a89cc104a30319f693b01616953e6e4d/nodejs_granular.jsonl (origin=hf://datasets/Aptlantis/nodejs-all.json@1247cc36a89cc104a30319f693b01616953e6e4d/nodejs_granular.jsonl)]
              
              Please either edit the data files to have matching columns, or separate them into different configurations (see docs at https://hf.co/docs/hub/datasets-manual-configuration#multiple-configurations)
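This kind of mismatch can be caught locally before upload by comparing the top-level keys of each JSONL file. A minimal sketch (`jsonl_columns` and `check_consistent` are hypothetical helper names, and the sample records below only mimic the two schemas from the error; they are not real rows):

```python
import json
import tempfile
from pathlib import Path

def jsonl_columns(path):
    """Collect the union of top-level keys across all records in a JSONL file."""
    cols = set()
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                cols.update(json.loads(line).keys())
    return cols

def check_consistent(paths):
    """Compare each later file's columns against the first file's columns."""
    base = jsonl_columns(paths[0])
    return {
        str(p): {
            "new": sorted(jsonl_columns(p) - base),
            "missing": sorted(base - jsonl_columns(p)),
        }
        for p in paths[1:]
    }

# Demo: two single-record files mimicking the two schemas from the error above.
tmp = Path(tempfile.mkdtemp())
(tmp / "nodejs_documents.jsonl").write_text(json.dumps({
    "source_file": "doc/api/all.markdown", "module_name": "timers",
    "type": "module", "description": "...", "methods_count": 8,
    "properties_count": 0, "events_count": 0, "content": "{}",
}) + "\n", encoding="utf-8")
(tmp / "nodejs_granular.jsonl").write_text(json.dumps({
    "id": "1", "type": "method", "parent": "timers",
    "name": "setTimeout", "description": "...", "metadata": "{}",
}) + "\n", encoding="utf-8")

report = check_consistent([tmp / "nodejs_documents.jsonl",
                           tmp / "nodejs_granular.jsonl"])
# report shows 4 new and 6 missing columns, matching the viewer's message.
```

Running this over all data files of a config before pushing makes the schema drift visible immediately, rather than at viewer-build time.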


Preview rows (columns: source_file string, module_name string, type string, description string, methods_count int64, properties_count int64, events_count int64, content string):

source_file: doc/api/all.markdown
module_name: timers
type: module
description: <p>All of the timer functions are globals. You do not need to <code>require()</code> this module in order to use them. </p>
methods_count: 8
properties_count: 0
events_count: 0
content: {"description": "<p>All of the timer functions are globals. You do not need to <code>require()</code>\nthis module in order to use them.\n\n</p>\n", "methods": [{"type": "method", "module": "timers", "name": "setTimeout", "description": "<p>To schedule execution of a one-time <code>callback</code> after <code>delay</c...

source_file: doc/api/all.markdown
module_name: module
type: module
description: <p>Node has a simple module loading system. In Node, files and modules are in one-to-one correspondence. As an example, <code>foo.js</code> loads the module <code>circle.js</code> in the same directory. </p> <p>The contents of <code>foo.js</code>: </p> <pre><code>var circle = require(&#39;./circle.js&#39;); console...
methods_count: 0
properties_count: 0
events_count: 0
content: {"description": "<p>Node has a simple module loading system. In Node, files and modules are in\none-to-one correspondence. As an example, <code>foo.js</code> loads the module\n<code>circle.js</code> in the same directory.\n\n</p>\n<p>The contents of <code>foo.js</code>:\n\n</p>\n<pre><code>var circle = require(&#39;....

source_file: doc/api/all.markdown
module_name: addons
type: module
description: <p>Addons are dynamically linked shared objects. They can provide glue to C and C++ libraries. The API (at the moment) is rather complex, involving knowledge of several libraries: </p> <ul> <li><p>V8 JavaScript, a C++ library. Used for interfacing with JavaScript: creating objects, calling functions, etc. Documented ...
methods_count: 0
properties_count: 0
events_count: 0
content: {"description": "<p>Addons are dynamically linked shared objects. They can provide glue to C and\nC++ libraries. The API (at the moment) is rather complex, involving\nknowledge of several libraries:\n\n</p>\n<ul>\n<li><p>V8 JavaScript, a C++ library. Used for interfacing with JavaScript:\ncreating objects, calling func...

source_file: doc/api/all.markdown
module_name: util
type: module
description: <p>These functions are in the module <code>&#39;util&#39;</code>. Use <code>require(&#39;util&#39;)</code> to access them. </p>
methods_count: 13
properties_count: 0
events_count: 0
content: {"description": "<p>These functions are in the module <code>&#39;util&#39;</code>. Use <code>require(&#39;util&#39;)</code> to access\nthem.\n\n\n</p>\n", "methods": [{"type": "method", "module": "util", "name": "format", "description": "<p>Returns a formatted string using the first argument as a <code>printf</code>-li...

source_file: doc/api/all.markdown
module_name: Events
type: module
description: <p>Many objects in Node emit events: a <code>net.Server</code> emits an event each time a peer connects to it, a <code>fs.readStream</code> emits an event when the file is opened. All objects which emit events are instances of <code>events.EventEmitter</code>. You can access this module by doing: <code>require(&quot;ev...
methods_count: 8
properties_count: 0
events_count: 1
content: {"description": "<p>Many objects in Node emit events: a <code>net.Server</code> emits an event each time\na peer connects to it, a <code>fs.readStream</code> emits an event when the file is\nopened. All objects which emit events are instances of <code>events.EventEmitter</code>.\nYou can access this module by doing: <c...

source_file: doc/api/all.markdown
module_name: domain
type: module
description: <p>Domains provide a way to handle multiple different IO operations as a single group. If any of the event emitters or callbacks registered to a domain emit an <code>error</code> event, or throw an error, then the domain object will be notified, rather than losing the context of the error in the <code>process.on(&#39;...
methods_count: 7
properties_count: 1
events_count: 0
content: {"description": "<p>Domains provide a way to handle multiple different IO operations as a\nsingle group. If any of the event emitters or callbacks registered to a\ndomain emit an <code>error</code> event, or throw an error, then the domain object\nwill be notified, rather than losing the context of the error in the\n<...

source_file: doc/api/all.markdown
module_name: buffer
type: module
description: <p>Pure JavaScript is Unicode friendly but not nice to binary data. When dealing with TCP streams or the file system, it&#39;s necessary to handle octet streams. Node has several strategies for manipulating, creating, and consuming octet streams. </p> <p>Raw data is stored in instances of the <code>Buffer</code> clas...
methods_count: 34
properties_count: 3
events_count: 0
content: {"description": "<p>Pure JavaScript is Unicode friendly but not nice to binary data. When\ndealing with TCP streams or the file system, it&#39;s necessary to handle octet\nstreams. Node has several strategies for manipulating, creating, and\nconsuming octet streams.\n\n</p>\n<p>Raw data is stored in instances of the <...

source_file: doc/api/all.markdown
module_name: stream
type: module
description: <p>A stream is an abstract interface implemented by various objects in Node. For example a request to an HTTP server is a stream, as is stdout. Streams are readable, writable, or both. All streams are instances of [EventEmitter][] </p> <p>You can load the Stream base classes by doing <code>require(&#39;stream&#39;)</...
methods_count: 19
properties_count: 0
events_count: 10
content: {"description": "<p>A stream is an abstract interface implemented by various objects in\nNode. For example a request to an HTTP server is a stream, as is\nstdout. Streams are readable, writable, or both. All streams are\ninstances of [EventEmitter][]\n\n</p>\n<p>You can load the Stream base classes by doing <code>requ...

source_file: doc/api/all.markdown
module_name: crypto
type: module
description: <pre><code>Stability: 2 - Unstable; API changes are being discussed for future versions. Breaking changes will be minimized. See below.</code></pre> <p>Use <code>require(&#39;crypto&#39;)</code> to access this module. </p> <p>The crypto module offers a way of encapsulating secure credentials to be used as part of a ...
methods_count: 40
properties_count: 1
events_count: 0
content: {"description": "<pre><code>Stability: 2 - Unstable; API changes are being discussed for\nfuture versions. Breaking changes will be minimized. See below.</code></pre>\n<p>Use <code>require(&#39;crypto&#39;)</code> to access this module.\n\n</p>\n<p>The crypto module offers a way of encapsulating secure credentials to...

source_file: doc/api/all.markdown
module_name: tls_(ssl)
type: module
description: <p>Use <code>require(&#39;tls&#39;)</code> to access this module. </p> <p>The <code>tls</code> module uses OpenSSL to provide Transport Layer Security and/or Secure Socket Layer: encrypted stream communication. </p> <p>TLS/SSL is a public/private key infrastructure. Each client and each server must have a private key...
methods_count: 11
properties_count: 8
events_count: 6
content: {"description": "<p>Use <code>require(&#39;tls&#39;)</code> to access this module.\n\n</p>\n<p>The <code>tls</code> module uses OpenSSL to provide Transport Layer Security and/or\nSecure Socket Layer: encrypted stream communication.\n\n</p>\n<p>TLS/SSL is a public/private key infrastructure. Each client and each\nserve...

source_file: doc/api/all.markdown
module_name: stringdecoder
type: module
description: <p>To use this module, do <code>require(&#39;string_decoder&#39;)</code>. StringDecoder decodes a buffer to a string. It is a simple interface to <code>buffer.toString()</code> but provides additional support for utf8. </p> <pre><code>var StringDecoder = require(&#39;string_decoder&#39;).StringDecoder; var decoder = n...
methods_count: 2
properties_count: 0
events_count: 0
content: {"description": "<p>To use this module, do <code>require(&#39;string_decoder&#39;)</code>. StringDecoder decodes a\nbuffer to a string. It is a simple interface to <code>buffer.toString()</code> but provides\nadditional support for utf8.\n\n</p>\n<pre><code>var StringDecoder = require(&#39;string_decoder&#39;).StringDe...

End of preview.