v1.4.0
Release date: 2020-04-22 00:34:58
Confluent's Python client for Apache Kafka
v1.4.0 is a feature release:
- KIP-98: Transactional Producer API
- KIP-345: Static consumer group membership (by @rnpridgeon); a configuration sketch follows this list
- KIP-511: Report client software name and version to broker
- Generic Serde API (experimental)
- New AvroSerializer and AvroDeserializer implementations, including configurable subject name strategies
- JSON Schema support (for Schema Registry)
- Protobuf support (for Schema Registry)
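For KIP-345, a consumer opts into static membership by setting a stable `group.instance.id`, so that a restarted instance rejoins as the same member instead of triggering a rebalance. A minimal sketch, assuming a local broker; the group, instance id, and topic names are illustrative placeholders:

```python
from confluent_kafka import Consumer

# Static group membership (KIP-345): a stable, per-instance
# group.instance.id lets the broker treat restarts as the same member,
# avoiding unnecessary rebalances within session.timeout.ms.
consumer = Consumer({
    'bootstrap.servers': 'localhost:9092',
    'group.id': 'example-static-group',
    'group.instance.id': 'worker-1',   # must be unique per instance
    'session.timeout.ms': 30000,       # member is evicted only after this timeout
    'auto.offset.reset': 'earliest',
})
consumer.subscribe(['example-topic'])
```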
confluent-kafka-python is based on librdkafka v1.4.0; see the librdkafka v1.4.0 release notes for a complete list of changes, enhancements, fixes, and upgrade considerations.
Transactional Producer API
Release v1.4.0 of confluent-kafka-python completes the Exactly-Once Semantics (EOS) feature set: the idempotent producer (since v1.0.0), a transaction-aware consumer (since v1.2.0), and, new in this release, full producer transaction support.
This enables developers to create Exactly-Once applications with Apache Kafka.
See the Transactions in Apache Kafka page for an introduction and check the transactions example.
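A minimal transactional-produce sketch, assuming a local broker; the topic and transactional.id below are placeholders, and the bundled transactions example shows the full consume-transform-produce loop:

```python
from confluent_kafka import Producer, KafkaException

# transactional.id must be stable and unique per producer instance;
# it is what lets the broker fence zombie producers.
producer = Producer({
    'bootstrap.servers': 'localhost:9092',
    'transactional.id': 'example-txn-producer',
})

producer.init_transactions()       # register the id and fence older instances
producer.begin_transaction()
try:
    for i in range(3):
        producer.produce('example-topic', key=str(i), value='message-%d' % i)
    producer.commit_transaction()  # flushes outstanding messages, then commits
except KafkaException:
    producer.abort_transaction()   # aborted messages are never visible to
                                   # read_committed consumers
```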
Generic Serializer API
Release v1.4.0 introduces a new, experimental API that adds serialization capabilities to the Kafka Producer and Consumer. This feature provides the ability to configure Producer/Consumer key and value serializers/deserializers independently. Previously, all serialization had to be handled prior to calling Producer.produce and after Consumer.poll.
This release ships with three built-in, Java-compatible standard serializer and deserializer classes:
Name | Type | Format |
---|---|---|
Double | float | IEEE 754 binary64 |
Integer | int | int32 |
String | Unicode | bytes* |
* The StringSerializer codec is configurable and supports any one of Python's standard encodings. If left unspecified, 'UTF-8' is used.
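The built-in classes plug into the new, experimental SerializingProducer shipped with this release. A minimal sketch, assuming a local broker; the topic and key are illustrative placeholders:

```python
from confluent_kafka import SerializingProducer
from confluent_kafka.serialization import DoubleSerializer, StringSerializer

# Key and value serializers are configured independently of one another.
producer = SerializingProducer({
    'bootstrap.servers': 'localhost:9092',
    'key.serializer': StringSerializer('utf_8'),   # codec is configurable
    'value.serializer': DoubleSerializer(),        # IEEE 754 binary64 on the wire
})
producer.produce(topic='temperatures', key='sensor-1', value=21.5)
producer.flush()
```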
Additional serialization implementations are possible through the extension of the Serializer and Deserializer base classes.
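As a sketch of such an extension, a hypothetical plain-JSON codec (no Schema Registry involved) might subclass the base classes as follows; `__call__` receives the object plus a SerializationContext and returns bytes, or the decoded object on the consumer side:

```python
import json
from confluent_kafka.serialization import Serializer, Deserializer

class PlainJsonSerializer(Serializer):
    """Hypothetical example codec: Python object -> UTF-8 JSON bytes."""
    def __call__(self, obj, ctx):
        if obj is None:
            return None
        return json.dumps(obj).encode('utf-8')

class PlainJsonDeserializer(Deserializer):
    """Hypothetical example codec: UTF-8 JSON bytes -> Python object."""
    def __call__(self, value, ctx):
        if value is None:
            return None
        return json.loads(value.decode('utf-8'))
```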
See avro_producer.py and avro_consumer.py for example usage.
Avro, Protobuf and JSON Schema Serializers
Release v1.4.0 of confluent-kafka-python adds support for two new Schema Registry serialization formats through the Generic Serialization API: JSON and Protobuf. A new set of Avro serialization classes has also been added to conform to the new API.
Format | Serializer Example | Deserializer Example |
---|---|---|
Avro | avro_producer.py | avro_consumer.py |
JSON | json_producer.py | json_consumer.py |
Protobuf | protobuf_producer.py | protobuf_consumer.py |
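A minimal Avro sketch modeled on the bundled avro_producer.py, assuming a local broker and Schema Registry; the schema, topic, and record below are illustrative, and keyword arguments are used for AvroSerializer to avoid relying on positional argument order:

```python
from confluent_kafka import SerializingProducer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import StringSerializer

# Placeholder Avro schema; with no to_dict callable supplied, values
# must be dicts matching this schema.
schema_str = """
{
  "type": "record",
  "name": "User",
  "fields": [
    {"name": "name", "type": "string"},
    {"name": "favorite_number", "type": "long"}
  ]
}
"""

schema_registry_client = SchemaRegistryClient({'url': 'http://localhost:8081'})

avro_serializer = AvroSerializer(
    schema_str=schema_str,
    schema_registry_client=schema_registry_client,
)

producer = SerializingProducer({
    'bootstrap.servers': 'localhost:9092',
    'key.serializer': StringSerializer('utf_8'),
    'value.serializer': avro_serializer,   # registers the schema on first use
})
producer.produce(topic='users', key='user-1',
                 value={'name': 'Jane', 'favorite_number': 7})
producer.flush()
```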
Security fixes
Two security issues have been identified in the SASL SCRAM protocol handler:
- The client nonce, which is expected to be a random string, was a static string.
- If `sasl.username` and `sasl.password` contained characters that needed escaping, a buffer overflow and heap corruption would occur. This was protected, but too late, by an assertion.
Both of these issues are fixed in this release.
Enhancements
- Bump OpenSSL to v1.0.2u
- Bump monitoring-interceptors to v0.11.3
Fixes
General:
- Remove unused variable from README example (@qzyse2017, #691)
- Add delivery report to Avro example (#742)
- Update asyncio example companion blog URL (@filippovitale, #760)
Schema Registry/Avro:
- Trim trailing `/` from Schema Registry base URL (@IvanProdaiko94, #749)
- Make automatic Schema registration optional (@dpfeif, #718)
- Bump Apache Avro to 1.9.2[.1] (@RyanSkraba, #779)
- Correct the SchemaRegistry authentication for SASL_INHERIT (@abij, #733)
Also see the librdkafka v1.4.0 release notes for fixes to the underlying client implementation.