Harnessing Semantic Kernel for LLM Integration
What you will learn
- What is the Semantic Kernel?
  The Semantic Kernel is an innovative open-source SDK designed to bridge the gap between AI functionalities and existing applications, supporting integration with various AI models from providers like OpenAI, Azure OpenAI, and Hugging Face.
- Where can the code for the example project on Semantic Kernel be found?
  All of the code for the example project on Semantic Kernel can be found in the companion GitHub repository at https://github.com/stephenc222/example-semantic-kernel.
- What does the JobSearchPlugin do in the context of Semantic Kernel?
  The JobSearchPlugin utilizes the Google Jobs API to conduct job searches based on user queries, demonstrating how the Semantic Kernel can be employed in diverse scenarios and integrate with external services to extract real-time data.
- What are the key features of the JobSearchPlugin?
  The key features of the JobSearchPlugin include API integration with the Google Jobs API, flexible authentication, Semantic Kernel function annotations (@sk_function and @sk_function_context_parameter), and real-world data processing.
- How does Semantic Kernel streamline the integration of AI functionalities into applications?
  Semantic Kernel streamlines the integration of AI functionalities into applications by providing a robust platform that allows existing code to be integrated into AI agents through plugins, along with interfaces that enable integration with any AI service. This reduces the amount of code required and offers versatile components for complex functionalities.
In the rapidly evolving digital era, the integration of LLM (large language model) functionalities into applications is becoming increasingly crucial. The Semantic Kernel emerges as a game-changer in this domain, enabling developers to seamlessly blend LLM models with existing codebases. This blog post focuses on an example project that illustrates the power and versatility of Semantic Kernel, particularly highlighting its capability to automate tasks and enhance user experiences.
All of the code for this blog post can be found in the companion GitHub repository at https://github.com/stephenc222/example-semantic-kernel.
Understanding Semantic Kernel
The Semantic Kernel is an innovative open-source SDK designed to bridge the gap between AI functionalities and existing applications. It supports integration with various AI models from prominent providers like OpenAI, Azure OpenAI, and Hugging Face. The key goal of Semantic Kernel is to empower developers to create intelligent agents that can automate processes and effectively respond to user interactions.
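As a rough sketch of what connecting one of these providers looks like, the snippet below registers an OpenAI chat model with a kernel. This is not part of the example project, and the module path and method names (OpenAIChatCompletion, add_chat_service) assume the pre-1.0 Python SDK used throughout this post; later releases rename and reorganize some of these APIs.

import os
import semantic_kernel as sk
# Assumed import path for the pre-1.0 SDK; newer versions restructure these modules
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

# Create the kernel that will host AI services and plugins
kernel = sk.Kernel()

# Register an OpenAI chat model; "chat-gpt" is an arbitrary service label,
# and the API key is read from the environment
kernel.add_chat_service(
    "chat-gpt",
    OpenAIChatCompletion("gpt-3.5-turbo", os.getenv("OPENAI_API_KEY")),
)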
Inside the Example Project
Our example project demonstrates the practical application of Semantic Kernel using the JobSearchPlugin. This example serves as an illustration of how the Semantic Kernel can be employed in diverse scenarios.
Project Setup
Setting up the project involves initializing the Semantic Kernel and importing the JobSearchPlugin class. The app.py script demonstrates this process, illustrating the simplicity and user-friendliness of the Semantic Kernel.
The JobSearch Plugin
A core part of our example project is the JobSearch Plugin. This plugin uses the Google Jobs API (via SerpApi) to conduct job searches based on user queries. Its search_jobs method is annotated with the sk_function and sk_function_context_parameter decorators. These annotations are crucial: they provide the Semantic Kernel with the metadata it needs to interpret and execute the function accurately.
import os

from serpapi import search as GoogleSearch
from semantic_kernel.skill_definition import (
    sk_function,
    sk_function_context_parameter,
)


class JobSearchPlugin:
    # optionally pass the api_key directly or set it in the environment variables
    def __init__(self, api_key: str = None):
        # get environment variable SERPAPI_API_KEY if not passed
        self.api_key = api_key or os.getenv("SERPAPI_API_KEY")

    @sk_function(
        description="Searches for jobs using Google Jobs API",
        name="SearchJobs",
    )
    @sk_function_context_parameter(
        name="query",
        description="The input search query, e.g., 'barista new york'",
    )
    def search_jobs(self, query: str) -> str:
        params = {
            "engine": "google_jobs",
            "q": query,
            "hl": "en",
            "api_key": self.api_key,
        }
        search = GoogleSearch(params)
        jobs_results = search.get("jobs_results", [])
        if not jobs_results:
            return "No job results found."
        # Process and format the results for output
        formatted_results = [
            f"{job['title']} at {job['company_name']}" for job in jobs_results
        ]
        return "\n".join(formatted_results)
Key Features of the JobSearch Plugin
- API Integration: The plugin integrates with the Google Jobs API, demonstrating how Semantic Kernel can connect with external services to extract real-time data.
- Flexibility in Authentication: The constructor of the JobSearchPlugin class (__init__) accepts an API key directly or fetches it from environment variables, offering flexibility in handling credentials securely (see the sketch after this list).
- Semantic Kernel Function Annotations: The @sk_function and @sk_function_context_parameter decorators are crucial. They inform the Semantic Kernel how to interact with the function, including its purpose (description) and expected input (the name and description of the context parameter).
- Real-World Data Processing: The search_jobs function takes a user query and processes it through the Google Jobs API. It then formats the job search results into a user-friendly string, demonstrating how to handle and present API data effectively.
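To make the authentication options above concrete, here is a small sketch (not taken from the example repository) showing both ways of constructing the plugin. It assumes, as in the pre-1.0 SDK, that the sk_function decorators only attach metadata, leaving the method callable as ordinary Python for quick local testing.

import os
from plugins.JobSearch import JobSearchPlugin

# Option 1: pass the SerpApi key explicitly (placeholder value shown)
plugin = JobSearchPlugin(api_key="your-serpapi-key")

# Option 2: rely on the SERPAPI_API_KEY environment variable (e.g. loaded from .env)
plugin = JobSearchPlugin()

# Because the decorators only add kernel metadata, the method can also be
# exercised directly during development, outside of the kernel
print(plugin.search_jobs("barista new york"))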
Importance in the Project
The JobSearch Plugin is a practical example of how Semantic Kernel can be used to create powerful AI-driven applications. It illustrates the SDK’s ability to handle external API calls, process data, and return meaningful results. This plugin not only adds significant value to the example project but also serves as a blueprint for developers looking to integrate similar functionalities into their applications.
Through this plugin, we can see the real-world impact of Semantic Kernel, bridging the gap between AI capabilities and practical application needs. Whether it’s for job searches, data analysis, or other API-driven tasks, the JobSearch Plugin exemplifies the versatility and power of Semantic Kernel in modern software development.
The app.py
Our app.py script is the entry point of the project, which we execute directly with the Python interpreter.
Imports of app.py
The imports section of our app.py sets the stage for utilizing Semantic Kernel's capabilities. Here's a breakdown:
from dotenv import load_dotenv
import semantic_kernel as sk
from plugins.JobSearch import JobSearchPlugin
load_dotenv()
In this section, we import load_dotenv from the dotenv package to manage environment variables, which is crucial for handling API keys and other sensitive data securely. The import semantic_kernel as sk statement brings in the Semantic Kernel SDK, the backbone of our application, enabling the integration of AI functionalities. Finally, from plugins.JobSearch import JobSearchPlugin imports the JobSearchPlugin we developed, allowing us to use its job-searching capabilities.
The main function of app.py
The app.py script is the core of our project, where the magic of Semantic Kernel comes to life. Here's an explanation:
async def main():
    # Initialize the kernel
    kernel = sk.Kernel()

    # Import the JobSearchPlugin
    jobs_plugin = kernel.import_skill(
        JobSearchPlugin(), skill_name="JobSearchPlugin")

    # Run the SearchJobs function with the context
    result = await kernel.run_async(
        jobs_plugin["SearchJobs"],
        input_str="software engineer, Spain",
    )

    print("Job Search Results:", result)


# Run the main function
if __name__ == "__main__":
    import asyncio

    asyncio.run(main())
In this script, we define an asynchronous main function. We start by initializing the Semantic Kernel (kernel = sk.Kernel()), which sets up our AI integration environment. Next, we import our JobSearchPlugin using kernel.import_skill, making its functionality accessible to the kernel. The key operation happens when we await kernel.run_async, executing the SearchJobs function from our plugin with the input "software engineer, Spain". This demonstrates how seamlessly we can invoke our AI-integrated functionality. The script concludes by printing the job search results, showcasing the practical output of our integration.
Running app.py is straightforward (make sure SERPAPI_API_KEY is set in your environment or .env file first, since the plugin reads it at startup):
python3 app.py
This command activates the Semantic Kernel, utilizing the JobSearchPlugin to perform a job search operation, illustrating the SDK’s efficiency and practical application.
The Advantages of Semantic Kernel
Semantic Kernel stands out for several key reasons:
- Extensibility: It allows existing code to be integrated into AI agents as plugins, extending the capabilities of an AI application (a minimal sketch follows this list).
- Flexibility: Its interfaces enable integration with any AI service, allowing AI models to be swapped out as advancements are made.
- Efficiency: It reduces the amount of code required to integrate AI functionalities, streamlining the development process.
- Versatility: Its components can be used separately or combined for more complex functionalities.
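To illustrate the extensibility point, the sketch below defines a deliberately trivial, hypothetical GreetingPlugin using the same decorator and registration pattern as the JobSearchPlugin. Nothing here comes from the example repository; names like GreetingPlugin and Greet are invented for illustration.

import semantic_kernel as sk
from semantic_kernel.skill_definition import sk_function


class GreetingPlugin:
    # A hypothetical, minimal plugin used only to illustrate extensibility
    @sk_function(
        description="Builds a short greeting for a given name",
        name="Greet",
    )
    def greet(self, name: str) -> str:
        # Any existing Python code can be exposed to the kernel this way
        return f"Hello, {name}! Welcome to Semantic Kernel."


kernel = sk.Kernel()
greeting_plugin = kernel.import_skill(GreetingPlugin(), skill_name="GreetingPlugin")
# The registered function is then invoked just like SearchJobs in app.py:
# result = await kernel.run_async(greeting_plugin["Greet"], input_str="Ada")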
Real-World Applications
Semantic Kernel has wide-ranging applications, from automating routine business tasks to conducting data analyses, responding to customer queries, and beyond. Its ability to integrate with different AI models and external APIs opens up a plethora of opportunities for developing sophisticated, intelligent applications.
Challenges and Key Considerations
While Semantic Kernel simplifies AI integration, developers must navigate the learning curve associated with its functionalities and manage the complexities of interactions between various plugins and AI models.
Conclusion
Semantic Kernel is at the forefront of AI development, offering a robust platform for integrating LLMs into applications. The JobSearchPlugin in this example project is a prime illustration of the SDK’s potential.
It’s not just a tool for developers; it’s a catalyst for innovation, enabling the creation of AI-enhanced applications that can transform industries and redefine user experiences. In the world of software development, the Semantic Kernel has the potential to rapidly become a standard for LLM integration, symbolizing a future where LLMs are seamlessly woven into the fabric of digital solutions.