AI based insecure code detection on Q&A sites (Master gratuation project).

Chrome Extension, TypeScript and Visual Studio Code

Prerequisites

  • Node.js and npm

Optional

  • Visual Studio Code

Includes the following

  • TypeScript
  • Webpack
  • React
  • Jest

Project Structure

  • src/typescript: TypeScript source files
  • src/assets: static files
  • dist: Chrome Extension directory
  • dist/js: Generated JavaScript files

Setup

Install dependencies

npm install

Build

npm run build

Build in watch mode

npm run watch

or

press Ctrl + Shift + B (VS Code)

Load the extension into Chrome

Open chrome://extensions, enable 'Developer mode', click 'Load unpacked', and select the dist directory.

Test

npx jest

or

npm run test
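As an illustration only, a Jest test in this kind of project could look like the sketch below; the module path and the detectInsecureCode function are assumptions, not names taken from this codebase:

  // Hypothetical test: 'detectInsecureCode' and its module path are assumed,
  // not necessarily present in this repository.
  import { detectInsecureCode } from "../src/typescript/detector";

  test("flags an obviously unsafe snippet", async () => {
    const verdict = await detectInsecureCode("eval(userInput)");
    expect(verdict.insecure).toBe(true);
  });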

Run and debug (VS Code)

Start 'context_page.html' in Chrome, with the extension installed, through the launch configuration:

'Run and Debug' tab (Ctrl+Shift+D) -> play

or

VS Code menu => Run => 'Start Debugging' (F5)

When started, 'context_page_minimal.html' opens in Chrome (Chrome must be installed). To open 'context_page.html' instead, change the 'file' parameter in 'launch.json'.
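For reference, a minimal 'launch.json' sketch along these lines; the exact paths and the '--load-extension' flag are assumptions, not copied from this repository:

  {
    "version": "0.2.0",
    "configurations": [
      {
        // Hypothetical launch configuration; adjust paths to this repository's layout.
        "name": "Launch Chrome with extension",
        "type": "chrome",
        "request": "launch",
        // The 'file' parameter selects which HTML page Chrome opens.
        "file": "${workspaceFolder}/dist/context_page_minimal.html",
        // Assumed: load the built extension from the dist directory.
        "runtimeArgs": ["--load-extension=${workspaceFolder}/dist"]
      }
    ]
  }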

If the extension is in 'dataset mode', the HTML page is cleared, and dataset information is injected into the page.
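As an illustration only (the function and element names below are hypothetical, not taken from this codebase), a content script in 'dataset mode' might clear and repopulate the page like this:

  // Hypothetical sketch: clear the original page and inject dataset entries.
  function renderDatasetPage(entries: { id: string; code: string }[]): void {
    document.body.innerHTML = "";                 // clear the original page
    for (const entry of entries) {
      const pre = document.createElement("pre");  // one block per dataset entry
      pre.textContent = `#${entry.id}\n${entry.code}`;
      document.body.appendChild(pre);
    }
  }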

LLM API keys can be added through the extension menu (right-click the extension icon and select "Options").
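A typical way for an options page to persist such a key is 'chrome.storage.sync'; a minimal sketch, where the element IDs and the 'llmApiKey' storage key are assumptions:

  // Hypothetical options-page handler: persists an API key entered by the user.
  const input = document.querySelector<HTMLInputElement>("#api-key");
  document.querySelector("#save")?.addEventListener("click", () => {
    if (input) {
      // 'llmApiKey' is an assumed storage key, not taken from this codebase.
      chrome.storage.sync.set({ llmApiKey: input.value }, () => {
        console.log("API key saved");
      });
    }
  });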

Menu dialog:

[screenshot: extension options menu]

In this menu, you can also activate 'dataset mode' and switch between LLMs and strategies.

Switching to another LLM and activating 'dataset mode' can also be done through the popup dialog (click the extension icon).
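Under the hood, a popup typically forwards such a switch to the rest of the extension via message passing; a hypothetical sketch, with a message shape that is assumed rather than taken from this codebase:

  // Hypothetical popup handlers: notify the service worker that the user
  // switched LLM or toggled 'dataset mode'.
  function switchLlm(model: string): void {
    chrome.runtime.sendMessage({ type: "set-llm", model });
  }

  function toggleDatasetMode(enabled: boolean): void {
    chrome.runtime.sendMessage({ type: "dataset-mode", enabled });
  }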

Popup dialog:

[screenshot: extension popup]

Master Thesis

This repository is the technical result of the master's thesis "Insecure code detection with LLMs", which can be downloaded here.
