MaheshSQL/miscellaneous

This repository contains miscellaneous script collections.

  • /Azure Functions: This directory has Azure Function App projects.

    • /AzureFunctions/Proj_LLMSQL:

      • This Azure Function project contains a function that answers natural language questions about data stored in Azure SQL Database tables. The function uses the LangChain SQLDatabaseToolkit backed by the gpt-35-turbo-16k LLM (the 4k version also works).
      • Azure OpenAI and Azure SQL Database resources must be provisioned beforehand.
      • Ensure the database contains the tables (with descriptive names) you would like to ask questions about.
      • The function can be called from any front-end application to render the output.
      • To run this Azure Functions project locally
        • Install the necessary prerequisites for Azure Functions (e.g. Azure Functions Core Tools)
        • Create and activate a local Python virtual environment.
        • pip install -r requirements.txt
        • Rename local.settings_template.json file to local.settings.json and update the values in this file.
        • Open this repository in VS Code.
        • Open VS Code terminal and navigate to \miscellaneous\Azure Functions\Proj_LLMSQL directory
        • func start
        • Open Postman and create a new POST request to the URL displayed in the terminal
          • Include the question for the database in the request body
          • Request body example: {"question":"Which customers live in Paris?"}
          • Include the function key as x-functions-key in the request header.
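Once `func start` is running, the same request can be sent from Python instead of Postman. A minimal sketch using only the standard library; the port is the local Core Tools default and the function route is hypothetical, so adjust both to match the URL shown in your terminal:

```python
import json
import urllib.request

# Local default host/port for Azure Functions Core Tools; the route
# "YourFunctionName" is a placeholder for the function in this project.
FUNCTION_URL = "http://localhost:7071/api/YourFunctionName"

def build_request(question: str, function_key: str = "") -> urllib.request.Request:
    """Build the POST request with the question in the body and the
    function key in the x-functions-key header (no key needed locally)."""
    body = json.dumps({"question": question}).encode("utf-8")
    headers = {"Content-Type": "application/json"}
    if function_key:
        headers["x-functions-key"] = function_key
    return urllib.request.Request(FUNCTION_URL, data=body,
                                  headers=headers, method="POST")

if __name__ == "__main__":
    req = build_request("Which customers live in Paris?")
    with urllib.request.urlopen(req) as resp:
        print(resp.read().decode("utf-8"))
```

The same code works against the deployed Function App once you swap in the Azure URL and pass the function key.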
      • To deploy to an Azure Function App
        • Create an Azure Function App (Python 3.9)
        • Open VS Code terminal and navigate to \miscellaneous\Azure Functions\Proj_LLMSQL directory
        • az login
        • az account set --subscription [Your Subscription ID]
        • func azure functionapp publish [Your function app name]
        • Update the Function App's application settings with the values from local.settings.json.
    • /AzureFunctions/GPT4V_Custom_Skill:

      • This Azure Function project contains a function that accepts an image as input (a base64-encoded string), passes it to a GPT-4 Vision endpoint and returns a text description based on the configuration parameters (system message, temperature, max response, etc.). The HTTP trigger function is built to be called as an Azure AI Search custom skill.
      • You require an Azure OpenAI service with a GPT-4 Vision deployment and an existing or new Azure AI Search service provisioned.
      • You will need to create additional AI Search artefacts to consume this functionality: a data source, a skillset and an indexer.
      • Please refer to the documentation on passing images to AI Search custom skills.
      • Please review the public documentation to learn more about GPT-4 Turbo with Vision.
      • The provided Azure Function code has been tested with Python 3.10.
      • The steps to run locally and deploy to Azure are the same as those listed in the previous section.
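An AI Search custom skill receives and returns documents wrapped in a `values` array keyed by `recordId`. A sketch of that request/response handling, assuming the skillset passes the base64 image under a `data.image` field (the field name and the `describe_image` placeholder are illustrative, not this project's actual code):

```python
import json

def describe_image(image_base64: str) -> str:
    """Placeholder for the GPT-4 Vision call; a real implementation would
    POST the image and configuration parameters to the Azure OpenAI endpoint."""
    return "a text description of the image"

def run_skill(request_body: str) -> dict:
    """Map each record of a custom-skill request to a response record,
    preserving recordId as the AI Search skillset contract requires."""
    records = json.loads(request_body).get("values", [])
    results = []
    for record in records:
        image_b64 = record["data"].get("image", "")
        results.append({
            "recordId": record["recordId"],
            "data": {"description": describe_image(image_b64)},
            "errors": None,
            "warnings": None,
        })
    return {"values": results}
```

The indexer matches output records back to documents by `recordId`, so every input record must appear exactly once in the response.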

  • /SynapseML: This directory contains Synapse ML project artefacts.

    • /SynapseML/ChatCompletion:

      • This is a collection of artefacts demonstrating ChatCompletion using Azure OpenAI gpt-35-turbo (0613) on a Spark dataframe in Synapse.

      • The code uses SynapseML library.

      • Setup instructions for the SynapseML library and the Spark MS SQL connector are given in the respective notebooks.

      • The sample dataset is a subset of the OpinRank dataset (hotel and car reviews) and is included in this repo.

      • You need to upload the provided dataset into your Synapse workspace (or a storage account connected to the workspace).

      • Ensure that Azure OpenAI service is provisioned and gpt-35-turbo (0613) model deployment is done beforehand.

      • Ensure you have a linked service created for an Azure Key Vault where the secrets are stored (AZSQLUSR, AZSQLPWD, OPENAISERVICENAME, OPENAIAPIKEY). Alternatively, you can provide them directly in code (not recommended).

      • Import both notebooks into your Synapse workspace.

        • The first notebook imports the provided dataset into an Azure SQL table.
        • The second notebook reads the data from the Azure SQL table and applies a custom LLM prompt to each record of the dataframe's nominated column using the ChatCompletion model.
      • Input dataset after import into the Azure SQL table, following a successful run of the first notebook (query run in SQL Server Management Studio).

      • ChatCompletion System Message

        You are the AI analyst and lead the change management team in the hospitality industry.
        
      • ChatCompletion Prompt

        Perform a thorough analysis of the comments written by our guests.
        
        If the comment has a positive sentiment, identify what we need to keep doing.
        If the comment has a negative sentiment, identify what we need to improve.
        For a neutral sentiment comment, say "N/A"
        
        Format your findings as per structure below:
        {
            "sentiment":"positive" / "negative" / "neutral",
            "identified_action": "",
            "action_type": "continue" / "improve" / "N/A"
        }
        
        Only return one finding per review comment.
        
        Provided review comments:
        
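Per record, the second notebook combines the system message and prompt above with the review comment into a ChatCompletion messages payload. A minimal sketch of that assembly (the prompt text is abbreviated here and the helper is illustrative, not the notebook's actual code):

```python
# Illustrative helper mirroring how the system message and prompt are
# combined with each review comment; the notebook's actual wiring may differ.
SYSTEM_MESSAGE = (
    "You are the AI analyst and lead the change management team "
    "in the hospitality industry."
)

PROMPT_PREFIX = (
    "Perform a thorough analysis of the comments written by our guests.\n"
    "...\n"  # full instructions and JSON structure as shown above
    "Provided review comments:\n"
)

def build_messages(review_comment: str) -> list:
    """Return the chat messages for one dataframe row."""
    return [
        {"role": "system", "content": SYSTEM_MESSAGE},
        {"role": "user", "content": PROMPT_PREFIX + review_comment},
    ]
```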
      • Output dataset created with SynapseML, saved into an Azure SQL table after a successful run of the second notebook (query run in SQL Server Management Studio).

      • The dataset and LLM prompt can be adjusted to your requirements once you have this solution working.

      • The LLM prompt produces output in JSON format. A T-SQL SELECT query to parse this output is also included in this repo.
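Besides the included T-SQL query, the JSON findings can also be parsed in Python. A small sketch assuming the model honoured the exact JSON structure requested by the prompt (real responses may need surrounding text stripped first):

```python
import json

def parse_finding(llm_output: str) -> dict:
    """Parse one model response into the sentiment / action fields
    defined by the prompt's JSON structure."""
    finding = json.loads(llm_output)
    return {
        "sentiment": finding.get("sentiment"),
        "identified_action": finding.get("identified_action"),
        "action_type": finding.get("action_type"),
    }
```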

  • /AzureML/AutoML BatchEndpoints: This directory contains notebooks to train an AutoML model and deploy it as a batch endpoint.

  • /LargeDocumentQA: This directory contains demo Python code to extract text from small or large PDF documents, then chunk and vectorize it into an AI Search index for a document QA task.
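As an illustration of the chunking step mentioned above, here is a naive fixed-size character chunker with overlap; the repo's actual chunking logic may differ:

```python
def chunk_text(text: str, chunk_size: int = 1000, overlap: int = 100) -> list:
    """Split extracted PDF text into overlapping fixed-size chunks,
    ready for embedding and upload to an AI Search index."""
    if chunk_size <= overlap:
        raise ValueError("chunk_size must exceed overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks
```

The overlap keeps sentences that straddle a chunk boundary retrievable from both neighbouring chunks.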

About

Miscellaneous code samples