feat(pubsub): support kafka #7032
Merged
Changes from all commits (20 commits)
4d3e996  feat: extend pubsub proto (bzp2010)
cb5e29b  feat: add kafka upstream scheme (bzp2010)
0db7413  chore: update kafka dependency (bzp2010)
ac25c0b  feat: add kafka support (bzp2010)
b2af519  chore: install pubsub.kafka (bzp2010)
d1f7280  test: add basic cases (bzp2010)
fb1f202  docs: add documentation (bzp2010)
797935d  fix: typo (bzp2010)
141aaa6  fix: ensure pre-produce kafka messages (bzp2010)
b7bebcf  test: add more cases coverage (bzp2010)
b9d3ace  fix: lint (bzp2010)
8ed6673  test: add kafka upstream case (bzp2010)
78b7f88  chore: check int64 convert bound (bzp2010)
3353581  chore: ensure pubsub proto use absolute link (bzp2010)
c65dac2  fix: replace all Pub-Sub to PubSub (bzp2010)
aec71b7  docs: fix style (bzp2010)
438d982  docs: fix style (bzp2010)
bc475da  ci(traffic-split): improve ci stability (#7055) (soulbird)
e5b258f  fix: review (bzp2010)
db49fcb  Merge master (bzp2010)
@@ -0,0 +1,137 @@
--
-- Licensed to the Apache Software Foundation (ASF) under one or more
-- contributor license agreements.  See the NOTICE file distributed with
-- this work for additional information regarding copyright ownership.
-- The ASF licenses this file to You under the Apache License, Version 2.0
-- (the "License"); you may not use this file except in compliance with
-- the License.  You may obtain a copy of the License at
--
--     http://www.apache.org/licenses/LICENSE-2.0
--
-- Unless required by applicable law or agreed to in writing, software
-- distributed under the License is distributed on an "AS IS" BASIS,
-- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
-- See the License for the specific language governing permissions and
-- limitations under the License.
--
local core = require("apisix.core")
local bconsumer = require("resty.kafka.basic-consumer")
local ffi = require("ffi")
local C = ffi.C
local tostring = tostring
local type = type
local ipairs = ipairs
local str_sub = string.sub

ffi.cdef[[
    int64_t atoll(const char *num);
]]


local _M = {}

-- Handles the conversion of 64-bit integers in lua-protobuf.
--
-- Because of the limitations of LuaJIT, we cannot use native 64-bit
-- numbers, so pb decode converts int64 to a string in "#xxx" format
-- to avoid loss of precision. This function converts such a string
-- to an int64 cdata number.
local function pb_convert_to_int64(src)
    if type(src) == "string" then
        -- the format is "#1234", so the minimum valid length is 2
        if #src < 2 then
            return 0
        end
        return C.atoll(ffi.cast("char *", src) + 1)
    else
        return src
    end
end

Inline review comment on the length check above: "Let's check src length to avoid out of bound." Author reply: "added".
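-- Added illustration (not part of the diff): with lua-protobuf's
-- int64_as_string option, a large int64 field is decoded as "#<digits>".
--   pb_convert_to_int64("#1656032460000") --> 1656032460000LL (int64_t cdata)
--   pb_convert_to_int64(42)               --> 42 (already a lua number, returned as-is)
--   pb_convert_to_int64("#")              --> 0  (caught by the length guard)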

-- Takes over requests of type kafka upstream in the http_access phase.
function _M.access(api_ctx)
    local pubsub, err = core.pubsub.new()
    if not pubsub then
        core.log.error("failed to initialize pubsub module, err: ", err)
        core.response.exit(400)
        return
    end

    local up_nodes = api_ctx.matched_upstream.nodes

    -- kafka client broker-related configuration
    local broker_list = {}
    for i, node in ipairs(up_nodes) do
        broker_list[i] = {
            host = node.host,
            port = node.port,
        }
    end
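    -- Added illustration (not part of the diff): APISIX normalizes upstream
    -- nodes into an array of tables such as
    --   { { host = "127.0.0.1", port = 9092, weight = 1 }, ... }
    -- so the loop above produces the { host = ..., port = ... } broker list
    -- that resty.kafka's client expects.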

    local client_config = {refresh_interval = 30 * 60 * 1000}

    -- load and create the consumer instance when it is determined
    -- that the websocket connection was created successfully
    local consumer = bconsumer:new(broker_list, client_config)
pubsub:on("cmd_kafka_list_offset", function (params) | ||
-- The timestamp parameter uses a 64-bit integer, which is difficult | ||
-- for luajit to handle well, so the int64_as_string option in | ||
-- lua-protobuf is used here. Smaller numbers will be decoded as | ||
-- lua number, while overly larger numbers will be decoded as strings | ||
-- in the format #number, where the # symbol at the beginning of the | ||
-- string will be removed and converted to int64_t with the atoll function. | ||
local timestamp = pb_convert_to_int64(params.timestamp) | ||
|
||
local offset, err = consumer:list_offset(params.topic, params.partition, timestamp) | ||
|
||
if not offset then | ||
return nil, "failed to list offset, topic: " .. params.topic .. | ||
", partition: " .. params.partition .. ", err: " .. err | ||
end | ||
|
||
offset = tostring(offset) | ||
return { | ||
kafka_list_offset_resp = { | ||
offset = str_sub(offset, 1, #offset - 2) | ||
} | ||
} | ||
end) | ||
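    -- Added illustration (not part of the diff): tostring() on an int64_t
    -- cdata appends an "LL" suffix, which str_sub strips off above:
    --   tostring(ffi.new("int64_t", 42)) --> "42LL"
    --   str_sub("42LL", 1, #"42LL" - 2)  --> "42"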

    pubsub:on("cmd_kafka_fetch", function (params)
        local offset = pb_convert_to_int64(params.offset)

        local ret, err = consumer:fetch(params.topic, params.partition, offset)
        if not ret then
            return nil, "failed to fetch message, topic: " .. params.topic ..
                ", partition: " .. params.partition .. ", err: " .. err
        end

        -- split into multiple messages when the amount of data in
        -- a single batch is too large
        local messages = ret.records

        -- special handling of int64 for LuaJIT compatibility
        for _, message in ipairs(messages) do
            local timestamp = tostring(message.timestamp)
            message.timestamp = str_sub(timestamp, 1, #timestamp - 2)

            local offset = tostring(message.offset)
            message.offset = str_sub(offset, 1, #offset - 2)
        end

        return {
            kafka_fetch_resp = {
                messages = messages,
            },
        }
    end)
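    -- Added note (the exact record shape is an assumption based on
    -- lua-resty-kafka's fetch results): each entry in ret.records is roughly
    --   { key = ..., value = ..., offset = <int64>, timestamp = <int64> },
    -- which is why offset and timestamp are stringified in the loop above.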

    -- start processing client commands
    pubsub:wait()
end


return _M
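As context for the module above, here is a minimal sketch of a route that points at a Kafka upstream, based on this PR's "feat: add kafka upstream scheme" commit. The /kafka URI and the broker address are placeholders, and the field set is assumed from the standard APISIX route/upstream schema:

-- Hypothetical route payload, shown as a Lua table;
-- "kafka-server1:9092" is a placeholder broker address.
local route = {
    uri = "/kafka",
    upstream = {
        nodes = { ["kafka-server1:9092"] = 1 },
        -- "none" is assumed here: no load balancing applies
        -- to a pubsub-style upstream
        type = "none",
        -- the upstream scheme this PR introduces
        scheme = "kafka",
    },
}

A client would then open a WebSocket connection on such a route and exchange the protobuf-encoded commands handled above (cmd_kafka_list_offset, cmd_kafka_fetch).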
@@ -216,7 +216,8 @@
       "type": "category",
       "label": "PubSub",
       "items": [
-        "pubsub"
+        "pubsub",
+        "pubsub/kafka"
       ]
     },
     {
Review conversation:

spacewander: Can we remove cmd_empty, which is test-only? Using cmd_kafka_fetch in pubsub.t is enough.

bzp2010: @spacewander This would make the pubsub module test the relevant code that relies on kafka, and I'm not sure if I should do that.

spacewander: What about adding a comment to show that this cmd is test-only?

bzp2010: After rechecking, I found that CmdEmpty has a test-only flag added. (See apisix/apisix/include/apisix/model/pubsub.proto, lines 35 to 39, in d955009.)