pyarrow.csv.CSVWriter#
- class pyarrow.csv.CSVWriter(sink, Schema schema, WriteOptions write_options=None, *, MemoryPool memory_pool=None)#
- Bases: _CRecordBatchWriter
- Writer to create a CSV file.
- Parameters:
- sink : str, path, pyarrow.OutputStream or file-like object
- The location where to write the CSV data.
- schema : pyarrow.Schema
- The schema of the data to be written.
- write_options : pyarrow.csv.WriteOptions
- Options to configure writing the CSV data.
- memory_pool : MemoryPool, optional
- Pool for temporary allocations.
 
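A minimal usage sketch, assuming a local file path; the file name, column names and values are illustrative:

```python
import pyarrow as pa
from pyarrow import csv

# Schema describing the rows to be written.
schema = pa.schema([("id", pa.int64()), ("name", pa.string())])

# In-memory Table matching that schema.
table = pa.table({"id": [1, 2, 3], "name": ["a", "b", "c"]}, schema=schema)

# The writer can be used as a context manager; leaving the block closes it.
# WriteOptions controls details such as whether a header row is emitted.
with csv.CSVWriter("example.csv", schema,
                   write_options=csv.WriteOptions(include_header=True)) as writer:
    writer.write_table(table)
```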
 - __init__(*args, **kwargs)#
- Methods
- __init__(*args, **kwargs)
- close(self): Close stream and write end-of-stream 0 marker.
- write(self, table_or_batch): Write RecordBatch or Table to stream.
- write_batch(self, RecordBatch batch[, ...]): Write RecordBatch to stream.
- write_table(self, Table table[, max_chunksize]): Write Table to stream in (contiguous) RecordBatch objects.
- Attributes
- stats: Current IPC write statistics.
- close(self)#
- Close stream and write end-of-stream 0 marker. 
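The end-of-stream marker mentioned above belongs to the Arrow IPC format; the docstring is inherited from the shared record-batch writer base class. A sketch of an explicit close() with an in-memory sink (names and data are illustrative):

```python
import pyarrow as pa
from pyarrow import csv

schema = pa.schema([("x", pa.int64())])
sink = pa.BufferOutputStream()  # in-memory sink instead of a file path

writer = csv.CSVWriter(sink, schema)
writer.write_table(pa.table({"x": [1, 2, 3]}, schema=schema))
writer.close()  # finalize and flush the CSV output

# The buffer now holds the CSV text (a header line followed by 1, 2, 3).
print(sink.getvalue().to_pybytes().decode())
```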
 - stats#
- Current IPC write statistics. 
 - write(self, table_or_batch)#
- Write RecordBatch or Table to stream.
- Parameters:
- table_or_batch : {RecordBatch, Table}
 
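A sketch showing that the same writer accepts both a RecordBatch and a Table through write() (names and data are illustrative):

```python
import pyarrow as pa
from pyarrow import csv

schema = pa.schema([("x", pa.int64())])
batch = pa.RecordBatch.from_pydict({"x": [1, 2]}, schema=schema)
table = pa.Table.from_pydict({"x": [3, 4]}, schema=schema)

# write() dispatches on the argument type, so record batches and tables
# can be appended through the same writer.
with csv.CSVWriter("mixed.csv", schema) as writer:
    writer.write(batch)
    writer.write(table)
```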
 
 - write_batch(self, RecordBatch batch, custom_metadata=None)#
- Write RecordBatch to stream.
- Parameters:
- batch : RecordBatch
- custom_metadata : mapping or KeyValueMetadata
- Keys and values must be string-like / coercible to bytes
 
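custom_metadata comes from the record-batch writer interface shared with the Arrow IPC writers; plain CSV has no place to store key/value metadata, so the sketch below leaves it at its default (names and data are illustrative):

```python
import pyarrow as pa
from pyarrow import csv

schema = pa.schema([("x", pa.int64()), ("y", pa.string())])
batch = pa.RecordBatch.from_pydict({"x": [1, 2], "y": ["a", "b"]}, schema=schema)

with csv.CSVWriter("batches.csv", schema) as writer:
    # Each call appends the batch's rows to the CSV output.
    writer.write_batch(batch)
    writer.write_batch(batch)
```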
 
 
 
    