my-cleaned-dataset-2

This is a merged LeRobot dataset created by combining selected episodes from multiple source datasets.

Dataset Information

  • Total Episodes: 4
  • Total Frames: 4
  • Source Repositories: DanqingZ/so100_test_6, DanqingZ/eval_act_so100_test
  • Format Version: 1.0
  • Created With: pathonai-backend

Episode Details

  • Episode 0: From DanqingZ/so100_test_6 (original episode 0) - 1 frame
  • Episode 1: From DanqingZ/so100_test_6 (original episode 1) - 1 frame
  • Episode 2: From DanqingZ/eval_act_so100_test (original episode 1) - 1 frame
  • Episode 3: From DanqingZ/eval_act_so100_test (original episode 3) - 1 frame
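
The episode mapping above can be captured as a small lookup table when you need to trace a merged episode back to its source. This is a minimal sketch in plain Python (the `EPISODE_SOURCES` table and `source_of` helper are illustrative, not part of the LeRobot API):

```python
# Merged episode index -> (source repository, original episode index),
# reproduced from the episode details above
EPISODE_SOURCES = {
    0: ("DanqingZ/so100_test_6", 0),
    1: ("DanqingZ/so100_test_6", 1),
    2: ("DanqingZ/eval_act_so100_test", 1),
    3: ("DanqingZ/eval_act_so100_test", 3),
}

def source_of(merged_episode: int) -> str:
    """Return a human-readable provenance string for a merged episode."""
    repo, original = EPISODE_SOURCES[merged_episode]
    return f"{repo}#episode_{original}"
```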

Usage

from lerobot.common.datasets.lerobot_dataset import LeRobotDataset

# Load the dataset
dataset = LeRobotDataset("DanqingZ/my-cleaned-dataset-2")

# LeRobot datasets are indexed frame by frame; episode_data_index
# gives the [from, to) frame range belonging to each episode
from_idx = dataset.episode_data_index["from"][0].item()
to_idx = dataset.episode_data_index["to"][0].item()
episode_0 = [dataset[i] for i in range(from_idx, to_idx)]
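
Because LeRobot stores data frame by frame, with each row carrying an `episode_index`, grouping frames back into episodes is a simple bucketing step. The sketch below is library-free and uses hypothetical frame records mimicking this dataset's four single-frame episodes:

```python
from collections import defaultdict

# Hypothetical frame records with the per-frame fields a LeRobot
# dataset exposes (one frame per episode, as in this dataset)
frames = [
    {"episode_index": 0, "frame_index": 0, "timestamp": 0.0},
    {"episode_index": 1, "frame_index": 0, "timestamp": 0.0},
    {"episode_index": 2, "frame_index": 0, "timestamp": 0.0},
    {"episode_index": 3, "frame_index": 0, "timestamp": 0.0},
]

def group_by_episode(frames):
    """Bucket frame-level records by their episode_index."""
    episodes = defaultdict(list)
    for frame in frames:
        episodes[frame["episode_index"]].append(frame)
    return dict(episodes)

episodes = group_by_episode(frames)  # 4 episodes, 1 frame each
```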

License

This dataset combines data from multiple sources. Please check the licenses of the original datasets:

  • DanqingZ/so100_test_6
  • DanqingZ/eval_act_so100_test