pyarrow.csv.CSVWriter

class pyarrow.csv.CSVWriter(sink, Schema schema, WriteOptions write_options=None, *, MemoryPool memory_pool=None)

Bases: _CRecordBatchWriter

Writer to create a CSV file.

Parameters:
sink : str, path, pyarrow.OutputStream or file-like object

The location to write the CSV data to.

schema : pyarrow.Schema

The schema of the data to be written.

write_options : pyarrow.csv.WriteOptions, optional

Options to configure writing the CSV data.

memory_pool : MemoryPool, optional

Pool for temporary allocations.
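
Examples

A minimal usage sketch; the file name "example.csv", the column names, and the WriteOptions settings are illustrative, not part of the API:

>>> import pyarrow as pa
>>> import pyarrow.csv as csv
>>> table = pa.table({"id": [1, 2, 3], "name": ["a", "b", "c"]})
>>> options = csv.WriteOptions(include_header=True)
>>> with csv.CSVWriter("example.csv", table.schema, write_options=options) as writer:
...     writer.write_table(table)  # serializes the table as CSV rows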

__init__(*args, **kwargs)

Methods

__init__(*args, **kwargs)

close(self)

Close stream and write end-of-stream 0 marker.

write(self, table_or_batch)

Write RecordBatch or Table to stream.

write_batch(self, RecordBatch batch[, ...])

Write RecordBatch to stream.

write_table(self, Table table[, max_chunksize])

Write Table to stream in (contiguous) RecordBatch objects.

Attributes

stats

Current IPC write statistics.

close(self)

Close stream and write end-of-stream 0 marker.
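
A sketch of explicit cleanup when the writer is not used as a context manager; "out.csv" is an illustrative path:

>>> import pyarrow as pa
>>> import pyarrow.csv as csv
>>> writer = csv.CSVWriter("out.csv", pa.schema([("x", pa.int64())]))
>>> try:
...     writer.write(pa.table({"x": [1, 2, 3]}))
... finally:
...     writer.close()  # flushes any buffered data and closes the sink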

stats

Current IPC write statistics.
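
A sketch of reading the statistics after writing. The field num_record_batches follows pyarrow.ipc.WriteStats; since these counters come from the shared record-batch writer machinery, they may be of limited interest for a CSV sink ("stats_demo.csv" is illustrative):

>>> import pyarrow as pa
>>> import pyarrow.csv as csv
>>> with csv.CSVWriter("stats_demo.csv", pa.schema([("x", pa.int64())])) as writer:
...     writer.write(pa.table({"x": [1, 2]}))
...     print(writer.stats.num_record_batches)  # WriteStats field; assumption: tracked for CSV sinks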

write(self, table_or_batch)

Write RecordBatch or Table to stream.

Parameters:
table_or_batch : {RecordBatch, Table}
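
A sketch showing both accepted input types; "rows.csv" is an illustrative path:

>>> import pyarrow as pa
>>> import pyarrow.csv as csv
>>> batch = pa.record_batch({"x": [1, 2]})
>>> with csv.CSVWriter("rows.csv", batch.schema) as writer:
...     writer.write(batch)                    # a single RecordBatch
...     writer.write(pa.table({"x": [3, 4]}))  # a Table, written batch by batch
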
write_batch(self, RecordBatch batch, custom_metadata=None)

Write RecordBatch to stream.

Parameters:
batch : RecordBatch

custom_metadata : mapping or KeyValueMetadata, optional

Keys and values must be string-like / coercible to bytes.
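
A sketch of writing a single batch. Note that custom_metadata comes from the inherited record-batch writer interface; a plain CSV file has nowhere to store such metadata, so it is omitted here ("batch.csv" is illustrative):

>>> import pyarrow as pa
>>> import pyarrow.csv as csv
>>> batch = pa.record_batch({"x": [1, 2, 3]})
>>> with csv.CSVWriter("batch.csv", batch.schema) as writer:
...     writer.write_batch(batch)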

write_table(self, Table table, max_chunksize=None)

Write Table to stream in (contiguous) RecordBatch objects.

Parameters:
table : Table

max_chunksize : int, default None

Maximum size for RecordBatch chunks. Individual chunks may be smaller depending on the chunk layout of individual columns.
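
A sketch of bounding the rows serialized per internal chunk; the resulting CSV content is the same regardless of max_chunksize ("chunked.csv" is illustrative):

>>> import pyarrow as pa
>>> import pyarrow.csv as csv
>>> table = pa.table({"x": list(range(10))})
>>> with csv.CSVWriter("chunked.csv", table.schema) as writer:
...     writer.write_table(table, max_chunksize=4)  # at most 4 rows per internal batch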