I’ve been using AWS EventBridge to create a pipe with a Kinesis source, and everything works fine with the source, filtering, and enrichment. But when I try to send the data to Redshift, it just won’t work: the logs say everything is OK, but the new data does not show up in Redshift.

The query I’m using is INSERT INTO public.event_bridge_test (id, status, test, test1) VALUES ('11234124321', '13123123', 'test', $.data); here data contains only a 'hello world!' string for testing. This insert does not work, but if I instead use something like INSERT INTO public.event_bridge_test (id, status, test, test1) VALUES ('11234124321', '13123123', 'test', 'test1'); the new row is inserted into Redshift with no problems.
I’ve been reading documentation for hours and even gave AWS AI a try, but with no luck. I also tried using a target input transformer, with the same result.

A list of everything I’ve tried:

INSERT INTO public.event_bridge_test (id, status, test, test1) VALUES ('5', json_extract_path(to_json('$context.event.detail'), '$.data.status'), 'test', 'test1')
INSERT INTO public.event_bridge_test (id, status, test, test1) VALUES ('6', $.data.status, 'test', 'test1')
INSERT INTO public.event_bridge_test (id, status, test, test1) VALUES ('7', $.status, 'test', 'test1')
INSERT INTO public.event_bridge_test (id, status, test, test1) VALUES ('8', json_extract_path(to_json('$context.event.detail'), '$.status'), 'test', 'test1')
INSERT INTO public.event_bridge_test (id, status, test, test1) VALUES ('9', json_extract_path(to_json('event.detail'), '$.status'),to_json('event.detail'), 'test', 'test1')
INSERT INTO public.event_bridge_test (id, status, test, test1) VALUES ('10', JSON_EXTRACT_PATH_TEXT('$context.event.detail','data', 'status'), 'test', 'test1')
INSERT INTO public.event_bridge_test (id, status, test, test1) VALUES ('11', json_extract_path(input, 'status') ,'test', 'test1');
INSERT INTO public.event_bridge_test (id, status, test, test1) VALUES ('11234124321', '13123123' ,'test', TO_CHAR($));
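
For reference, Redshift’s own JSON_EXTRACT_PATH_TEXT takes a JSON string followed by the path elements as separate arguments, not a '$.data.status'-style JSONPath (and json_extract_path / to_json are Postgres functions rather than Redshift ones). Here is a minimal sanity check that runs directly in a Redshift session, with a literal JSON string standing in for the event payload:

-- Runs independently of the pipe; the JSON literal is a stand-in
-- for whatever payload the pipe would substitute.
INSERT INTO public.event_bridge_test (id, status, test, test1)
VALUES ('12',
        JSON_EXTRACT_PATH_TEXT('{"data": {"status": "13123123"}}', 'data', 'status'),
        'test',
        JSON_EXTRACT_PATH_TEXT('{"data": "hello world!"}', 'data'));

If a row like this inserts cleanly, the SQL side is fine and the failing piece is the substitution of event data into the statement.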

2 Answers


  1. This sounds like it might be a COMMIT issue (everything appears to work, but no data shows up). The database doesn’t permanently store changes without a COMMIT, and it doesn’t let another session see committed changes until that session is brought up to date with its own COMMIT or ROLLBACK.
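
    As a minimal sketch, an explicitly committed version of the test insert would look like this in a regular session (same table and values as in the question; whether the pipe’s calls auto-commit depends on how they are issued):

    BEGIN;
    INSERT INTO public.event_bridge_test (id, status, test, test1)
    VALUES ('11234124321', '13123123', 'test', 'test1');
    COMMIT;  -- without this, other sessions will not see the row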

    So this leads to several questions.

    • What logs are you reviewing?
    • How are you checking that the data changes aren’t in the database?
    • Can you see the SQL you are issuing in the Redshift catalog tables (like STL_QUERYTEXT)? See the sketch after this list.
    • How are you inserting the set of data that is working? Through a workbench session?
    • What COMMITs are you sending, and what COMMIT settings are on your connections to the database?
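
    A minimal sketch of that catalog check (the table name comes from the question; adjust the filter and limit as needed):

    -- Look for the pipe's INSERT in Redshift's query history.
    SELECT query, starttime, querytxt
    FROM stl_query
    WHERE querytxt ILIKE '%event_bridge_test%'
    ORDER BY starttime DESC
    LIMIT 20;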

    On a side note: since you are using Kinesis, there’s a good chance you expect this solution to scale to high volume. The issue you will run into is that inserting data row by row into Redshift performs very poorly and can over-stress the cluster, slowing down other work. You should look into batching the data and loading it through S3 with a COPY, the way Kinesis Data Firehose does.
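
    A minimal sketch of what that COPY-based load could look like (the bucket, prefix, and IAM role below are hypothetical placeholders):

    -- Batch records to S3 (e.g., via Firehose), then bulk-load them;
    -- COPY is the high-throughput path into Redshift, row-by-row INSERTs are not.
    COPY public.event_bridge_test
    FROM 's3://my-bucket/events/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftCopyRole'
    FORMAT AS JSON 'auto';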

  2. @luis-godínez Did you find an answer to your question?
    I’m having the same issue: a success message in the CloudWatch logs, but no data inserted into Redshift.
