FullStackWithLawrence/aws-openai

OpenAI Code Samples


A React + AWS Serverless full stack implementation of the example applications found in the official OpenAI API documentation. See this system architecture diagram for details. This is an instructional tool for the YouTube channel "Full Stack With Lawrence" and for the University of British Columbia course "Artificial Intelligence Cloud Technology Implementation".

Quickstart

Works in Linux, Windows, and macOS environments.

  1. Verify project requirements: AWS account and CLI access, Terraform, Python 3.11, npm, and Docker Compose.

  2. Review and edit the master Terraform configuration file.

  3. Run make and add your credentials to the newly created .env file in the root of the repo.

  4. Initialize, build and run the application.

git clone https://github.com/FullStackWithLawrence/aws-openai.git
make        # scaffold a .env file in the root of the repo
make init   # initialize Terraform, Python virtual environment and NPM
make build  # deploy AWS cloud infrastructure, build ReactJS web app
make run    # run the web app locally in your dev environment
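After step 3, the scaffolded .env file needs your credentials. A minimal sketch of what it might contain is shown below; the variable names here are illustrative, not necessarily the exact keys the scaffold generates, so check the file that `make` creates for the real ones.

```
# .env (scaffolded by `make`) -- illustrative keys only
OPENAI_API_ORGANIZATION=org-...
OPENAI_API_KEY=sk-...
AWS_REGION=us-east-1
```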

Features

  • Complete OpenAI API: Deploys a production-ready API for integrating with OpenAI's complete suite of services, including ChatGPT, DALL·E, Whisper, and TTS.

  • LangChain Integration: A simple API endpoint for building context-aware, reasoning applications with LangChain’s flexible abstractions and AI-first toolkit. Use this endpoint to develop a wide range of applications, from chatbots to question-answering systems.

  • Dynamic ChatGPT Prompting: Simple Terraform templates to create highly personalized chatbots. Program and skin your own custom chat apps in minutes.

  • Function Calling: OpenAI's most advanced integration feature to date. OpenAI API Function Calling enables developers to integrate their own custom Python functions into the processing of chat responses. For example, while a chatbot powered by a GPT model is generating a response, it can call these custom Python functions to perform specific tasks or computations, then include the results in its reply. This powerful feature can be used to create more dynamic and interactive chatbots that fetch real-time data, perform calculations, or interact with other APIs and services. See the Python source code for additional documentation and examples, including "get_current_weather()" from the official OpenAI API documentation.

  • Function Calling Plugins: We created our own YAML-based "plugin" model. See this example plugin and this documentation for details, or try it out on this live site. YAML templates can be stored locally or served from a secure AWS S3 bucket. You'll find a set of fun example plugins here.
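The function-calling flow described above can be sketched in plain Python. This is an illustrative stub, not the repo's actual implementation: the `get_current_weather` function is modeled on the example in the OpenAI docs, the tool schema mirrors the shape the chat completions API expects, and the "tool call" is simulated locally rather than returned by a live model.

```python
import json

# Hypothetical local function the model may ask us to call, modeled
# on "get_current_weather" from the official OpenAI documentation.
def get_current_weather(location: str, unit: str = "celsius") -> str:
    # A real implementation would query a weather API; this stub
    # returns canned data for illustration.
    return json.dumps({"location": location, "temperature": 22, "unit": unit})

# Tool schema in the shape the OpenAI chat completions API expects.
tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    },
}]

# Simulated tool call, as it would arrive inside the model's response.
tool_call = {"name": "get_current_weather",
             "arguments": json.dumps({"location": "Vancouver, BC"})}

# Dispatch the model's requested call to the matching local function.
dispatch = {"get_current_weather": get_current_weather}
result = dispatch[tool_call["name"]](**json.loads(tool_call["arguments"]))
print(result)
```

In the real API, `result` would be sent back to the model in a follow-up message so it can fold the data into its final answer.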

Marv

ReactJS chat application

Complete source code and documentation are located here.

A React app that leverages Vite.js, @chatscope/chat-ui-kit-react, and react-pro-sidebar.

Webapp Key features

  • Robust, highly customizable chat features
  • A component model for implementing your own highly personalized OpenAI apps
  • Skinnable UI for each app
  • Includes default assets for each app
  • Small compact code base
  • Robust error handling for non-200 response codes from the custom REST API
  • Handles direct text input as well as file attachments
  • Info link to the OpenAI API official code sample
  • Build-deploy managed with Vite

Custom OpenAI REST API Backend

Complete documentation is located here. Python code is located here.

A REST API implementing each of the 30 example applications from the official OpenAI API documentation, using a modularized Terraform approach. It leverages OpenAI's suite of AI models, including GPT-3.5, GPT-4, DALL·E, Whisper, Embeddings, and Moderation.

API Key features

  • OpenAI API library for Python, with LangChain-enabled API endpoints where designated.
  • Pydantic-based, CI-CD-friendly Settings configuration class that consistently and automatically manages Python Lambda initializations from multiple sources, including bash environment variables, .env and terraform.tfvars files.
  • CloudWatch logging
  • Fully automated and parameterized Terraform build. Usually builds your infrastructure in under a minute.
  • Secure: uses AWS role-based security and custom IAM policies. Best practice handling of secrets and sensitive data in all environments (dev, test, CI-CD, prod). Proxy-based API that hides your OpenAI API calls and credentials. Runs on https with AWS-managed SSL/TLS certificate.
  • Excellent documentation
  • AWS serverless implementation. Free or nearly free in most cases.
  • Deploy to a custom domain name
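The layered settings resolution described above (environment variables, then a .env file, then defaults) can be sketched with the standard library alone. This is a simplified stand-in for the repo's actual Pydantic Settings class, and the key names below are hypothetical, used only to illustrate the precedence order.

```python
import os

def load_settings(env_file_lines, defaults):
    """Resolve each setting from (1) OS environment variables,
    then (2) a .env-style file, then (3) a hard-coded default."""
    file_values = {}
    for line in env_file_lines:
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            file_values[key.strip()] = value.strip()
    return {
        key: os.environ.get(key, file_values.get(key, default))
        for key, default in defaults.items()
    }

# Illustrative inputs -- not the repo's real keys.
env_file = ["# sample .env", "OPENAI_API_KEY=sk-example", "DEBUG_MODE=true"]
defaults = {"OPENAI_API_KEY": "", "DEBUG_MODE": "false", "AWS_REGION": "us-east-1"}

os.environ["AWS_REGION"] = "us-west-2"  # an env var overrides the default
settings = load_settings(env_file, defaults)
print(settings)
```

The actual project layers in terraform.tfvars as well, but the precedence idea is the same: the most environment-specific source wins.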

Requirements

Optional requirements:

Documentation

Detailed documentation for each endpoint is available here: Documentation

Support

To get community support, go to the official Issues Page for this project.

Good Coding Best Practices

This project demonstrates a wide variety of good coding best practices for managing mission-critical cloud-based microservices in a team environment, most notably its adherence to 12-Factor Methodology. Please see these Code Management Best Practices for additional details.

We want to make this project more accessible to students and learners as an instructional tool, without adding undue code review workloads for anyone with merge authority on the project. To this end, we've also added several pre-commit code linting and code style enforcement tools, as well as automated procedures for version maintenance of package dependencies, pull request evaluations, and semantic releases.

Contributing

We welcome contributions! There are a variety of ways for you to get involved, regardless of your background. In addition to pull requests, this project would benefit from contributors focused on documentation and how-to video content creation, testing, community engagement, and stewardship to help us ensure that we comply with evolving standards for the ethical use of AI.

For developers, please see:

You can also contact Lawrence McDaniel directly. Code composition as of Feb-2024:

-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          29            732            722           2663
HCL                             30            352            714           2353
Markdown                        52            779              6           2344
YAML                            23            112            149           1437
JavaScript                      39            114            127           1088
JSX                              6             45             47            858
CSS                              5             32             14            180
make                             1             27             30            120
Text                             6             13              0            117
INI                              2             15              0             70
HTML                             2              1              0             65
Jupyter Notebook                 1              0            186             48
Bourne Shell                     5             17             55             47
TOML                             1              1              0             23
Dockerfile                       1              4              4              5
-------------------------------------------------------------------------------
SUM:                           203          2,244          2,054         11,418
-------------------------------------------------------------------------------