
"Please check whether you have installed the "@aws-sdk/crc64-nvme-crt" package explicitly" Error on AWS Lambda #6855

Open
3 of 4 tasks
yixiangding opened this issue Jan 29, 2025 · 7 comments
Assignees
Labels
bug This issue is a bug. closing-soon This issue will automatically close in 4 days unless further comments are made. p2 This is a standard priority issue

Comments

@yixiangding

yixiangding commented Jan 29, 2025

Checkboxes for prior research

Describe the bug

Hello AWS team,

We experienced a production incident earlier today caused by the error "Please check whether you have installed the "@aws-sdk/crc64-nvme-crt" package explicitly." thrown from our code running the AWS SDK on Lambda Node.js 18.x.

What's strange is that it only affects around 50% of our user base in production. Also, it's only reproducible on 1 of our 3 aliases (the prod alias, not dev or staging), which led me to think this could be related to Lambda's underlying dependency management.

Of the 2 affected Lambda functions, one was remedied by simply running npm i @aws-sdk/crc64-nvme-crt and redeploying. However, the same fix, along with adding require("@aws-sdk/crc64-nvme-crt"), did not work for the other Lambda and we had to take the service down.
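
For reference, a minimal sketch of what the remediation looked like on the function that recovered (file name and handler shape are illustrative, not our exact code):

    // index.js (CommonJS Lambda handler) - illustrative sketch only
    // Register the CRT-backed CRC64-NVME implementation before any S3 call,
    // as the error message suggests.
    require("@aws-sdk/crc64-nvme-crt");

    const { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");

    const s3 = new S3Client({});

    exports.handler = async (event) => {
      // Without the registration above, reads of newly written objects were
      // failing response checksum validation on the affected SDK versions.
      const res = await s3.send(
        new GetObjectCommand({ Bucket: event.bucket, Key: event.key })
      );
      return { statusCode: res.$metadata.httpStatusCode };
    };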

Regression Issue

  • Select this option if this issue appears to be a regression.

SDK version number

"@aws-sdk/client-dynamodb": "^3.449.0", "@aws-sdk/client-kms": "^3.445.0", "@aws-sdk/client-lambda": "^3.418.0", "@aws-sdk/client-s3": "^3.449.0", "@aws-sdk/client-sns": "^3.449.0", "@aws-sdk/crc64-nvme-crt": "^3.735.0", "@aws-sdk/signature-v4-crt": "^3.734.0", "@aws-sdk/util-dynamodb": "^3.449.0"

Which JavaScript Runtime is this issue in?

Node.js

Details of the browser/Node.js/ReactNative version

Lambda Node.js 18.x

Reproduction Steps

Unfortunately still unknown. There had been no code changes for 2 weeks, and the error suddenly started today.

Observed Behavior

Error: Please check whether you have installed the "@aws-sdk/crc64-nvme-crt" package explicitly. 
You must also register the package by calling [require("@aws-sdk/crc64-nvme-crt");] or an ESM equivalent such as [import "@aws-sdk/crc64-nvme-crt";]. 
For more information please go to https://github.com/aws/aws-sdk-js-v3#functionality-requiring-aws-common-runtime-crt
    at selectChecksumAlgorithmFunction (/var/task/node_modules/@aws-sdk/middleware-flexible-checksums/dist-cjs/index.js:209:15)
    at validateChecksumFromResponse (/var/task/node_modules/@aws-sdk/middleware-flexible-checksums/dist-cjs/index.js:411:35)
    at /var/task/node_modules/@aws-sdk/middleware-flexible-checksums/dist-cjs/index.js:466:11
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async /var/task/node_modules/@aws-sdk/middleware-sdk-s3/dist-cjs/index.js:174:20
    at async /var/task/node_modules/@smithy/middleware-serde/dist-cjs/index.js:33:24
    at async /var/task/node_modules/@aws-sdk/middleware-sdk-s3/dist-cjs/index.js:483:18
    at async /var/task/node_modules/@smithy/middleware-retry/dist-cjs/index.js:321:38
    at async /var/task/node_modules/@aws-sdk/middleware-sdk-s3/dist-cjs/index.js:109:22
    at async /var/task/node_modules/@aws-sdk/middleware-sdk-s3/dist-cjs/index.js:136:14

Expected Behavior

No error

Possible Solution

Came across this issue and am wondering if it could be related: #6822 (comment)?

Additional Information/Context

Any suggestions would be appreciated.

@yixiangding yixiangding added bug This issue is a bug. needs-triage This issue or PR still needs to be triaged. labels Jan 29, 2025
@aBurmeseDev
Member

aBurmeseDev commented Jan 29, 2025

Hi @yixiangding - thanks for reporting this issue. It looks like this was due to the recent change in S3 default integrity protections.

Could you please upgrade to v3.735.0 or a later version and verify the fix? Please let us know if the issue persists.
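
If upgrading immediately isn't possible everywhere, one stopgap that may help (a sketch only, assuming the requestChecksumCalculation / responseChecksumValidation client options introduced alongside the default integrity protections are available in your installed version) is to relax the new checksum defaults on the client:

    // Illustrative sketch - only compute/validate checksums when the
    // operation requires them, instead of the new "when supported" default.
    const { S3Client } = require("@aws-sdk/client-s3");

    const s3 = new S3Client({
      requestChecksumCalculation: "WHEN_REQUIRED",
      responseChecksumValidation: "WHEN_REQUIRED",
    });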

@aBurmeseDev aBurmeseDev self-assigned this Jan 29, 2025
@aBurmeseDev aBurmeseDev added response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. p2 This is a standard priority issue and removed needs-triage This issue or PR still needs to be triaged. labels Jan 29, 2025
@peter-at-work

It's unfortunate that this recent change in S3 behavior will necessitate immediate production patches for certain use cases. An S3-reading application that happened to be built with one of the bad S3 client versions would keep running fine in production until its S3 bucket receives new objects written with the integrity protections, at which point the application suddenly breaks.
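
For illustration, a plain read path like the sketch below (bucket and key are placeholders) keeps working until the object it fetches carries the new CRC64-NVME checksum; on the affected client versions, validating that checksum then throws the error above.

    // Illustrative sketch - nothing unusual about this call; it only starts
    // failing once the fetched object has a CRC64-NVME checksum and the
    // affected SDK version tries to validate it without the CRT package.
    const { S3Client, GetObjectCommand } = require("@aws-sdk/client-s3");

    const s3 = new S3Client({});

    async function readObject() {
      const res = await s3.send(
        new GetObjectCommand({ Bucket: "example-bucket", Key: "example-key" })
      );
      return res.Body.transformToString();
    }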

@yixiangding
Author

Thanks for the quick facts, everyone! That explains a lot of the puzzles we've seen today...

I'll try 3.735.0. Meanwhile, @aBurmeseDev, do you have an approximate range of the SDK versions that are considered "bad versions", so that on our side we can at least prevent this issue from spreading to our other services?

@github-actions github-actions bot removed the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Jan 30, 2025
@kuhe
Contributor

kuhe commented Jan 30, 2025

We don't have a full list, but the version you're using that encountered this error should show a deprecation message during installation, emitted from the @aws-sdk/middleware-flexible-checksums package.

@colingagnon

Tripped over this today, luckily with a staging deploy and not in production.

Same as the author: running npm i @aws-sdk/crc64-nvme-crt and adding import '@aws-sdk/crc64-nvme-crt' did not resolve the issue with our build.

I can confirm that locking deps to 3.735.0 did resolve the issue (at least for us):

    "@aws-sdk/client-s3": "3.735.0",
    "@aws-sdk/crc64-nvme-crt": "3.735.0",
    "@aws-sdk/lib-storage": "3.735.0",
    "@aws-sdk/s3-request-presigner": "3.735.0",

@aBurmeseDev
Member

Thanks, all, for reporting, and apologies for any inconvenience caused. As previously mentioned, this should no longer be an issue starting with v3.735.0 or a later version. Please report back to us if the issue persists for any reason and we'd be happy to help!

@aBurmeseDev aBurmeseDev added the closing-soon This issue will automatically close in 4 days unless further comments are made. label Jan 30, 2025
@nccho

nccho commented Jan 31, 2025

We are operating two AWS Lambda services, both using the same Node.js version (16.x) and the same @aws-sdk/client-s3 version (^3.329.0).
However, the error only occurs in one of the services.
We would like to understand the possible reasons for this discrepancy.

Please check whether you have installed the "@aws-sdk/crc64-nvme-crt" package explicitly.  
You must also register the package by calling [require("@aws-sdk/crc64-nvme-crt");] or an ESM equivalent such as [import "@aws-sdk/crc64-nvme-crt";].  
For more information please go to https://github.com/aws/aws-sdk-js-v3#functionality-requiring-aws-common-runtime-crt  

For reference, the AWS Lambda that encountered the error is now functioning correctly after installing @aws-sdk/client-s3 version 3.735.0.

Could passing the VersionId parameter in GetObjectCommandInput of @aws-sdk/client-s3 be related to the issue?
