3. Interacting with LangCache: Basic Operations
Now that you understand the core concepts of semantic caching, let's dive into the practical aspects of interacting with Redis LangCache. This chapter focuses on the most common operations (storing responses, searching for them, and deleting them), with detailed examples in both Python and Node.js.
3.1 Initialization and Authentication
Before performing any operations, you need to initialize the LangCache client with your service credentials. These credentials (API Host, Cache ID, API Key) should be loaded from your .env file, as set up in Chapter 1.
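For reference, the .env file from Chapter 1 would contain entries like the following (the values shown here are placeholders; substitute your own service credentials):

```shell
# .env (placeholder values -- replace with your real credentials)
LANGCACHE_API_HOST=your-cache-host.example.com
LANGCACHE_CACHE_ID=your-cache-id
LANGCACHE_API_KEY=your-api-key
```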
Python Initialization
# python-examples/basic_operations.py
import os
import asyncio  # used by the async examples later in this chapter
from dotenv import load_dotenv
from langcache import LangCache

load_dotenv()

LANGCACHE_API_HOST = os.getenv("LANGCACHE_API_HOST")
LANGCACHE_CACHE_ID = os.getenv("LANGCACHE_CACHE_ID")
LANGCACHE_API_KEY = os.getenv("LANGCACHE_API_KEY")

lang_cache = LangCache(
    server_url=f"https://{LANGCACHE_API_HOST}",
    cache_id=LANGCACHE_CACHE_ID,
    api_key=LANGCACHE_API_KEY,
)

print("LangCache client initialized (Python).")
Node.js Initialization
// nodejs-examples/basic_operations.js
require('dotenv').config({ path: '../.env' });
const { LangCache } = require('@redis-ai/langcache');

const LANGCACHE_API_HOST = process.env.LANGCACHE_API_HOST;
const LANGCACHE_CACHE_ID = process.env.LANGCACHE_CACHE_ID;
const LANGCACHE_API_KEY = process.env.LANGCACHE_API_KEY;

const langCache = new LangCache({
  serverURL: `https://${LANGCACHE_API_HOST}`,
  cacheId: LANGCACHE_CACHE_ID,
  apiKey: LANGCACHE_API_KEY,
});

console.log("LangCache client initialized (Node.js).");
3.2 Storing a Prompt and Response (set method)
The set method is used to store a new prompt and its corresponding LLM response in LangCache. When you call set, LangCache automatically generates an embedding for the prompt and stores it alongside the response.
Python set Examples
# In python-examples/basic_operations.py (add inside an async function, e.g., main)
async def store_examples():
    print("\n--- Storing Examples (Python) ---")

    # Example 1: Basic storage
    prompt1 = "What are the benefits of cloud computing?"
    response1 = "Cloud computing offers scalability, cost-effectiveness, flexibility, and enhanced security compared to traditional on-premise solutions."
    print(f"Storing: '{prompt1}' -> '{response1[:50]}...'")
    key1 = await lang_cache.set(prompt=prompt1, response=response1)
    print(f"Stored entry with key: {key1}")

    # Example 2: Storing with metadata
    prompt2 = "Explain the concept of containerization."
    response2 = "Containerization is a virtualization method that allows you to package an application with all its dependencies into a single, isolated unit called a container. This ensures consistency across different environments."
    metadata2 = {"topic": "DevOps", "source_llm": "ChatGPT-4", "version": "1.0"}
    print(f"Storing: '{prompt2}' with metadata {metadata2}")
    key2 = await lang_cache.set(prompt=prompt2, response=response2, metadata=metadata2)
    print(f"Stored entry with key: {key2}")

    # Example 3: Storing with a TTL (Time-To-Live in seconds)
    # This entry will expire after 60 seconds
    prompt3 = "What is the current temperature in London?"
    response3 = "The current temperature in London is 15°C and partly cloudy."
    ttl_seconds = 60
    print(f"Storing: '{prompt3}' with TTL of {ttl_seconds} seconds")
    key3 = await lang_cache.set(prompt=prompt3, response=response3, ttl=ttl_seconds)
    print(f"Stored entry with key: {key3}. It will expire soon.")

    # Always allow some time for indexing
    await asyncio.sleep(2)
    print("Storage examples complete.")

# In main():
# await store_examples()
Node.js set Examples
// In nodejs-examples/basic_operations.js (add inside an async function, e.g., main)
async function storeExamples() {
  console.log("\n--- Storing Examples (Node.js) ---");

  // Example 1: Basic storage
  const prompt1 = "What are the benefits of cloud computing?";
  const response1 = "Cloud computing offers scalability, cost-effectiveness, flexibility, and enhanced security compared to traditional on-premise solutions.";
  console.log(`Storing: '${prompt1}' -> '${response1.substring(0, 50)}...'`);
  const result1 = await langCache.set({ prompt: prompt1, response: response1 });
  console.log(`Stored entry with entryId: ${result1.entryId}`);

  // Example 2: Storing with attributes (metadata)
  const prompt2 = "Explain the concept of containerization.";
  const response2 = "Containerization is a virtualization method that allows you to package an application with all its dependencies into a single, isolated unit called a container. This ensures consistency across different environments.";
  const attributes2 = { topic: "DevOps", source_llm: "Gemini Pro", version: "1.0" };
  console.log(`Storing: '${prompt2}' with attributes ${JSON.stringify(attributes2)}`);
  const result2 = await langCache.set({ prompt: prompt2, response: response2, attributes: attributes2 });
  console.log(`Stored entry with entryId: ${result2.entryId}`);

  // Example 3: Storing with a TTL (Time-To-Live in seconds)
  // This entry will expire after 60 seconds
  const prompt3 = "What is the current temperature in London?";
  const response3 = "The current temperature in London is 15°C and partly cloudy.";
  const ttlSeconds = 60;
  console.log(`Storing: '${prompt3}' with TTL of ${ttlSeconds} seconds`);
  const result3 = await langCache.set({ prompt: prompt3, response: response3, ttl: ttlSeconds });
  console.log(`Stored entry with entryId: ${result3.entryId}. It will expire soon.`);

  // Always allow some time for indexing
  await new Promise(resolve => setTimeout(resolve, 2000));
  console.log("Storage examples complete.");
}

// In main():
// await storeExamples();
3.3 Searching the Cache (search method)
The search method is your primary way to query LangCache. You provide a prompt, and LangCache returns semantically similar cached responses, if any, along with their similarity scores.
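Before looking at the client calls, it helps to build intuition for what a similarity threshold does. The toy snippet below filters made-up (prompt, score) pairs the way a threshold would; the scores are invented purely for illustration, since LangCache computes real scores from embeddings:

```python
# Hypothetical (prompt, score) pairs, as a semantic search might rank them.
# The scores below are invented for illustration only.
candidates = [
    ("What is the capital of Germany?", 0.95),      # near-exact paraphrase
    ("Tell me about Germany's capital city.", 0.88),
    ("What is the capital of France?", 0.62),       # related but different topic
]

def passing(candidates, threshold):
    """Keep only candidates whose similarity score meets the threshold."""
    return [prompt for prompt, score in candidates if score >= threshold]

print(passing(candidates, 0.9))  # strict: only the closest paraphrase survives
print(passing(candidates, 0.7))  # looser: admits more distant matches
```

A higher threshold trades recall for precision: fewer cache hits, but the hits you get are more likely to actually answer the user's question.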
Python search Examples
# In python-examples/basic_operations.py (add inside an async function, e.g., main)
async def search_examples():
    print("\n--- Searching Examples (Python) ---")

    # Ensure some data is in the cache from previous step or manual insertion
    await lang_cache.set(prompt="What is the capital of Germany?", response="Berlin")
    await lang_cache.set(prompt="How does machine learning work?", response="Machine learning involves training algorithms on data to identify patterns and make predictions.")
    await asyncio.sleep(2)  # Give it time to index

    # Example 1: Basic search
    search_prompt1 = "Capital of Germany?"
    print(f"Searching for: '{search_prompt1}'")
    results1 = await lang_cache.search(prompt=search_prompt1)
    if results1:
        print(f"Cache Hit! Response: '{results1[0].response[:50]}...', Score: {results1[0].score:.4f}")
    else:
        print("Cache Miss.")

    # Example 2: Search with a lower similarity threshold (to potentially find more matches)
    search_prompt2 = "Tell me about learning with machines."
    print(f"\nSearching for: '{search_prompt2}' with threshold 0.7")
    results2 = await lang_cache.search(prompt=search_prompt2, similarity_threshold=0.7)
    if results2:
        for i, res in enumerate(results2):
            print(f"Result {i+1}: Response: '{res.response[:50]}...', Score: {res.score:.4f}")
    else:
        print("Cache Miss.")

    # Example 3: Search with `num_results` to get multiple similar entries
    search_prompt3 = "What are some coding best practices?"
    await lang_cache.set(prompt="Write clean, readable code with meaningful variable names.", response="Clean code is crucial.")
    await lang_cache.set(prompt="Use version control like Git for collaboration and tracking changes.", response="Git is essential.")
    await lang_cache.set(prompt="Implement robust error handling and logging in your applications.", response="Error handling is key.")
    await asyncio.sleep(2)  # Give it time to index
    print(f"\nSearching for: '{search_prompt3}' (top 2 results)")
    results3 = await lang_cache.search(prompt=search_prompt3, num_results=2)
    if results3:
        for i, res in enumerate(results3):
            print(f"Result {i+1}: Response: '{res.response[:50]}...', Score: {res.score:.4f}")
    else:
        print("Cache Miss.")
    print("\nSearch examples complete.")

# In main():
# await search_examples()
Node.js search Examples
// In nodejs-examples/basic_operations.js (add inside an async function, e.g., main)
async function searchExamples() {
  console.log("\n--- Searching Examples (Node.js) ---");

  // Ensure some data is in the cache from previous step or manual insertion
  await langCache.set({ prompt: "What is the capital of Germany?", response: "Berlin" });
  await langCache.set({ prompt: "How does machine learning work?", response: "Machine learning involves training algorithms on data to identify patterns and make predictions." });
  await new Promise(resolve => setTimeout(resolve, 2000)); // Give it time to index

  // Example 1: Basic search
  const searchPrompt1 = "Capital of Germany?";
  console.log(`Searching for: '${searchPrompt1}'`);
  const results1 = await langCache.search({ prompt: searchPrompt1 });
  if (results1 && results1.results.length > 0) {
    const firstResult = results1.results[0];
    console.log(`Cache Hit! Response: '${firstResult.response.substring(0, 50)}...', Score: ${firstResult.score.toFixed(4)}`);
  } else {
    console.log("Cache Miss.");
  }

  // Example 2: Search with a lower similarity threshold
  const searchPrompt2 = "Tell me about learning with machines.";
  console.log(`\nSearching for: '${searchPrompt2}' with threshold 0.7`);
  const results2 = await langCache.search({ prompt: searchPrompt2, similarityThreshold: 0.7 });
  if (results2 && results2.results.length > 0) {
    results2.results.forEach((res, i) => {
      console.log(`Result ${i+1}: Response: '${res.response.substring(0, 50)}...', Score: ${res.score.toFixed(4)}`);
    });
  } else {
    console.log("Cache Miss.");
  }

  // Example 3: Search with `numResults` to get multiple similar entries
  const searchPrompt3 = "What are some coding best practices?";
  await langCache.set({ prompt: "Write clean, readable code with meaningful variable names.", response: "Clean code is crucial." });
  await langCache.set({ prompt: "Use version control like Git for collaboration and tracking changes.", response: "Git is essential." });
  await langCache.set({ prompt: "Implement robust error handling and logging in your applications.", response: "Error handling is key." });
  await new Promise(resolve => setTimeout(resolve, 2000)); // Give it time to index
  console.log(`\nSearching for: '${searchPrompt3}' (top 2 results)`);
  const results3 = await langCache.search({ prompt: searchPrompt3, numResults: 2 });
  if (results3 && results3.results.length > 0) {
    results3.results.forEach((res, i) => {
      console.log(`Result ${i+1}: Response: '${res.response.substring(0, 50)}...', Score: ${res.score.toFixed(4)}`);
    });
  } else {
    console.log("Cache Miss.");
  }
  console.log("\nSearch examples complete.");
}

// In main():
// await searchExamples();
3.4 Deleting Entries (delete_by_id / deleteById, delete_query / deleteQuery)
You can remove entries from the cache by their specific entryId (returned by the set method) or by providing attributes to delete all matching entries.
Python delete Examples
# In python-examples/basic_operations.py (add inside an async function, e.g., main)
async def delete_examples():
    print("\n--- Deletion Examples (Python) ---")

    # Store an entry to delete by ID
    prompt_to_delete_id = "Temporary data point."
    response_to_delete_id = "This will be deleted."
    key_to_delete = await lang_cache.set(prompt=prompt_to_delete_id, response=response_to_delete_id)
    print(f"Stored entry for deletion by ID: {key_to_delete}")
    await asyncio.sleep(1)

    # Delete by ID
    print(f"Deleting entry with ID: {key_to_delete}")
    await lang_cache.delete_by_id(entry_id=key_to_delete)
    print("Deletion by ID complete. Verifying search...")
    results_after_delete = await lang_cache.search(prompt="Temporary data point.")
    if not results_after_delete:
        print("Entry successfully deleted (search yielded no results).")
    else:
        print("Entry not deleted or still found in search.")

    # Store entries to delete by query (attributes)
    await lang_cache.set(prompt="Product A feature 1", response="Feature 1 description.", metadata={"product": "A"})
    await lang_cache.set(prompt="Product A feature 2", response="Feature 2 description.", metadata={"product": "A"})
    await lang_cache.set(prompt="Product B feature 1", response="Feature 1 description.", metadata={"product": "B"})
    print("\nStored entries for Product A and Product B.")
    await asyncio.sleep(2)

    # Delete by query (all entries for 'product': 'A')
    print("Deleting all entries with attribute 'product': 'A'")
    await lang_cache.delete_query(attributes={"product": "A"})
    print("Deletion by query complete. Verifying search...")
    results_prod_a = await lang_cache.search(prompt="Product A feature", attributes={"product": "A"})
    if not results_prod_a:
        print("Entries for Product A successfully deleted.")
    else:
        print(f"Entries for Product A still found: {len(results_prod_a)}")
    print("\nDeletion examples complete.")

# In main():
# await delete_examples()
Node.js delete Examples
// In nodejs-examples/basic_operations.js (add inside an async function, e.g., main)
async function deleteExamples() {
  console.log("\n--- Deletion Examples (Node.js) ---");

  // Store an entry to delete by ID
  const promptToDeleteId = "Temporary data point.";
  const responseToDeleteId = "This will be deleted.";
  const storeResultDelete = await langCache.set({ prompt: promptToDeleteId, response: responseToDeleteId });
  const entryIdToDelete = storeResultDelete.entryId;
  console.log(`Stored entry for deletion by ID: ${entryIdToDelete}`);
  await new Promise(resolve => setTimeout(resolve, 1000));

  // Delete by ID
  console.log(`Deleting entry with ID: ${entryIdToDelete}`);
  await langCache.deleteById({ entryId: entryIdToDelete });
  console.log("Deletion by ID complete. Verifying search...");
  const resultsAfterDelete = await langCache.search({ prompt: "Temporary data point." });
  if (!resultsAfterDelete || resultsAfterDelete.results.length === 0) {
    console.log("Entry successfully deleted (search yielded no results).");
  } else {
    console.log("Entry not deleted or still found in search.");
  }

  // Store entries to delete by query (attributes)
  await langCache.set({ prompt: "Product A feature 1", response: "Feature 1 description.", attributes: { product: "A" } });
  await langCache.set({ prompt: "Product A feature 2", response: "Feature 2 description.", attributes: { product: "A" } });
  await langCache.set({ prompt: "Product B feature 1", response: "Feature 1 description.", attributes: { product: "B" } });
  console.log("\nStored entries for Product A and Product B.");
  await new Promise(resolve => setTimeout(resolve, 2000));

  // Delete by query (all entries for 'product': 'A')
  console.log("Deleting all entries with attribute 'product': 'A'");
  await langCache.deleteQuery({ attributes: { product: "A" } });
  console.log("Deletion by query complete. Verifying search...");
  const resultsProdA = await langCache.search({ prompt: "Product A feature", attributes: { product: "A" } });
  if (!resultsProdA || resultsProdA.results.length === 0) {
    console.log("Entries for Product A successfully deleted.");
  } else {
    console.log(`Entries for Product A still found: ${resultsProdA.results.length}`);
  }
  console.log("\nDeletion examples complete.");
}

// In main():
// await deleteExamples();
3.5 Clear All Entries (clear method)
The clear method removes all entries from your LangCache service. Use this with caution, especially in production environments!
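One defensive pattern worth considering (this is purely an application-side guard, not part of the LangCache API; the APP_ENV variable name is an assumption for illustration) is to refuse to clear unless the current environment is on an explicit allow-list:

```python
import os

# Hypothetical guard: only permit clear() in environments we explicitly trust.
SAFE_ENVIRONMENTS = {"development", "test", "local"}

def clear_allowed(env=None):
    """Return True only when the environment is an explicitly safe one."""
    if env is None:
        env = os.getenv("APP_ENV", "")  # assumed env-var name
    return env.lower() in SAFE_ENVIRONMENTS

# Usage inside an async function, guarding the real (destructive) call:
# if clear_allowed():
#     await lang_cache.clear()

print(clear_allowed("development"))  # True
print(clear_allowed("production"))   # False
```

Defaulting to "not allowed" when APP_ENV is unset means a misconfigured deployment fails safe rather than wiping the cache.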
Python clear Example
# In python-examples/basic_operations.py (add inside an async function, e.g., main)
async def clear_cache_example():
    print("\n--- Clear Cache Example (Python) ---")

    # Store some data first
    await lang_cache.set(prompt="Clear test 1", response="Data to be cleared.")
    await lang_cache.set(prompt="Clear test 2", response="More data to be cleared.")
    print("Stored some entries for clearing.")
    await asyncio.sleep(1)

    print("Clearing all entries from LangCache...")
    await lang_cache.clear()
    print("Cache cleared. Verifying search...")
    results_after_clear = await lang_cache.search(prompt="Clear test 1")
    if not results_after_clear:
        print("Cache successfully cleared (no results found).")
    else:
        print("Cache not cleared or still found results.")
    print("\nClear cache example complete.")

# In main():
# await clear_cache_example()
Node.js clear Example
// In nodejs-examples/basic_operations.js (add inside an async function, e.g., main)
async function clearCacheExample() {
  console.log("\n--- Clear Cache Example (Node.js) ---");

  // Store some data first
  await langCache.set({ prompt: "Clear test 1", response: "Data to be cleared." });
  await langCache.set({ prompt: "Clear test 2", response: "More data to be cleared." });
  console.log("Stored some entries for clearing.");
  await new Promise(resolve => setTimeout(resolve, 1000));

  console.log("Clearing all entries from LangCache...");
  await langCache.clear();
  console.log("Cache cleared. Verifying search...");
  const resultsAfterClear = await langCache.search({ prompt: "Clear test 1" });
  if (!resultsAfterClear || resultsAfterClear.results.length === 0) {
    console.log("Cache successfully cleared (no results found).");
  } else {
    console.log("Cache not cleared or still found results.");
  }
  console.log("\nClear cache example complete.");
}

// In main():
// await clearCacheExample();
For convenience, here are the complete basic_operations.py and basic_operations.js files, with a main function that runs all of the examples above in sequence (comment out the clear step if you don't want to wipe your cache):
python-examples/basic_operations.py
import os
import asyncio
from dotenv import load_dotenv
from langcache import LangCache

load_dotenv()

LANGCACHE_API_HOST = os.getenv("LANGCACHE_API_HOST")
LANGCACHE_CACHE_ID = os.getenv("LANGCACHE_CACHE_ID")
LANGCACHE_API_KEY = os.getenv("LANGCACHE_API_KEY")

lang_cache = LangCache(
    server_url=f"https://{LANGCACHE_API_HOST}",
    cache_id=LANGCACHE_CACHE_ID,
    api_key=LANGCACHE_API_KEY,
)

print("LangCache client initialized (Python).")

async def store_examples():
    print("\n--- Storing Examples (Python) ---")
    prompt1 = "What are the benefits of cloud computing?"
    response1 = "Cloud computing offers scalability, cost-effectiveness, flexibility, and enhanced security compared to traditional on-premise solutions."
    print(f"Storing: '{prompt1}' -> '{response1[:50]}...'")
    key1 = await lang_cache.set(prompt=prompt1, response=response1)
    print(f"Stored entry with key: {key1}")
    prompt2 = "Explain the concept of containerization."
    response2 = "Containerization is a virtualization method that allows you to package an application with all its dependencies into a single, isolated unit called a container. This ensures consistency across different environments."
    metadata2 = {"topic": "DevOps", "source_llm": "ChatGPT-4", "version": "1.0"}
    print(f"Storing: '{prompt2}' with metadata {metadata2}")
    key2 = await lang_cache.set(prompt=prompt2, response=response2, metadata=metadata2)
    print(f"Stored entry with key: {key2}")
    prompt3 = "What is the current temperature in London?"
    response3 = "The current temperature in London is 15°C and partly cloudy."
    ttl_seconds = 60
    print(f"Storing: '{prompt3}' with TTL of {ttl_seconds} seconds")
    key3 = await lang_cache.set(prompt=prompt3, response=response3, ttl=ttl_seconds)
    print(f"Stored entry with key: {key3}. It will expire soon.")
    await asyncio.sleep(2)
    print("Storage examples complete.")

async def search_examples():
    print("\n--- Searching Examples (Python) ---")
    await lang_cache.set(prompt="What is the capital of Germany?", response="Berlin")
    await lang_cache.set(prompt="How does machine learning work?", response="Machine learning involves training algorithms on data to identify patterns and make predictions.")
    await asyncio.sleep(2)
    search_prompt1 = "Capital of Germany?"
    print(f"Searching for: '{search_prompt1}'")
    results1 = await lang_cache.search(prompt=search_prompt1)
    if results1:
        print(f"Cache Hit! Response: '{results1[0].response[:50]}...', Score: {results1[0].score:.4f}")
    else:
        print("Cache Miss.")
    search_prompt2 = "Tell me about learning with machines."
    print(f"\nSearching for: '{search_prompt2}' with threshold 0.7")
    results2 = await lang_cache.search(prompt=search_prompt2, similarity_threshold=0.7)
    if results2:
        for i, res in enumerate(results2):
            print(f"Result {i+1}: Response: '{res.response[:50]}...', Score: {res.score:.4f}")
    else:
        print("Cache Miss.")
    search_prompt3 = "What are some coding best practices?"
    await lang_cache.set(prompt="Write clean, readable code with meaningful variable names.", response="Clean code is crucial.")
    await lang_cache.set(prompt="Use version control like Git for collaboration and tracking changes.", response="Git is essential.")
    await lang_cache.set(prompt="Implement robust error handling and logging in your applications.", response="Error handling is key.")
    await asyncio.sleep(2)
    print(f"\nSearching for: '{search_prompt3}' (top 2 results)")
    results3 = await lang_cache.search(prompt=search_prompt3, num_results=2)
    if results3:
        for i, res in enumerate(results3):
            print(f"Result {i+1}: Response: '{res.response[:50]}...', Score: {res.score:.4f}")
    else:
        print("Cache Miss.")
    print("\nSearch examples complete.")

async def delete_examples():
    print("\n--- Deletion Examples (Python) ---")
    prompt_to_delete_id = "Temporary data point."
    response_to_delete_id = "This will be deleted."
    key_to_delete = await lang_cache.set(prompt=prompt_to_delete_id, response=response_to_delete_id)
    print(f"Stored entry for deletion by ID: {key_to_delete}")
    await asyncio.sleep(1)
    print(f"Deleting entry with ID: {key_to_delete}")
    await lang_cache.delete_by_id(entry_id=key_to_delete)
    print("Deletion by ID complete. Verifying search...")
    results_after_delete = await lang_cache.search(prompt="Temporary data point.")
    if not results_after_delete:
        print("Entry successfully deleted (search yielded no results).")
    else:
        print("Entry not deleted or still found in search.")
    await lang_cache.set(prompt="Product A feature 1", response="Feature 1 description.", metadata={"product": "A"})
    await lang_cache.set(prompt="Product A feature 2", response="Feature 2 description.", metadata={"product": "A"})
    await lang_cache.set(prompt="Product B feature 1", response="Feature 1 description.", metadata={"product": "B"})
    print("\nStored entries for Product A and Product B.")
    await asyncio.sleep(2)
    print("Deleting all entries with attribute 'product': 'A'")
    await lang_cache.delete_query(attributes={"product": "A"})
    print("Deletion by query complete. Verifying search...")
    results_prod_a = await lang_cache.search(prompt="Product A feature", attributes={"product": "A"})
    if not results_prod_a:
        print("Entries for Product A successfully deleted.")
    else:
        print(f"Entries for Product A still found: {len(results_prod_a)}")
    print("\nDeletion examples complete.")

async def clear_cache_example():
    print("\n--- Clear Cache Example (Python) ---")
    await lang_cache.set(prompt="Clear test 1", response="Data to be cleared.")
    await lang_cache.set(prompt="Clear test 2", response="More data to be cleared.")
    print("Stored some entries for clearing.")
    await asyncio.sleep(1)
    print("Clearing all entries from LangCache...")
    await lang_cache.clear()
    print("Cache cleared. Verifying search...")
    results_after_clear = await lang_cache.search(prompt="Clear test 1")
    if not results_after_clear:
        print("Cache successfully cleared (no results found).")
    else:
        print("Cache not cleared or still found results.")
    print("\nClear cache example complete.")

async def main():
    await store_examples()
    await search_examples()
    await delete_examples()
    await clear_cache_example()  # Use with caution!

if __name__ == "__main__":
    asyncio.run(main())
nodejs-examples/basic_operations.js
require('dotenv').config({ path: '../.env' });
const { LangCache } = require('@redis-ai/langcache');

const LANGCACHE_API_HOST = process.env.LANGCACHE_API_HOST;
const LANGCACHE_CACHE_ID = process.env.LANGCACHE_CACHE_ID;
const LANGCACHE_API_KEY = process.env.LANGCACHE_API_KEY;

const langCache = new LangCache({
  serverURL: `https://${LANGCACHE_API_HOST}`,
  cacheId: LANGCACHE_CACHE_ID,
  apiKey: LANGCACHE_API_KEY,
});

console.log("LangCache client initialized (Node.js).");

async function storeExamples() {
  console.log("\n--- Storing Examples (Node.js) ---");
  const prompt1 = "What are the benefits of cloud computing?";
  const response1 = "Cloud computing offers scalability, cost-effectiveness, flexibility, and enhanced security compared to traditional on-premise solutions.";
  console.log(`Storing: '${prompt1}' -> '${response1.substring(0, 50)}...'`);
  const result1 = await langCache.set({ prompt: prompt1, response: response1 });
  console.log(`Stored entry with entryId: ${result1.entryId}`);
  const prompt2 = "Explain the concept of containerization.";
  const response2 = "Containerization is a virtualization method that allows you to package an application with all its dependencies into a single, isolated unit called a container. This ensures consistency across different environments.";
  const attributes2 = { topic: "DevOps", source_llm: "Gemini Pro", version: "1.0" };
  console.log(`Storing: '${prompt2}' with attributes ${JSON.stringify(attributes2)}`);
  const result2 = await langCache.set({ prompt: prompt2, response: response2, attributes: attributes2 });
  console.log(`Stored entry with entryId: ${result2.entryId}`);
  const prompt3 = "What is the current temperature in London?";
  const response3 = "The current temperature in London is 15°C and partly cloudy.";
  const ttlSeconds = 60;
  console.log(`Storing: '${prompt3}' with TTL of ${ttlSeconds} seconds`);
  const result3 = await langCache.set({ prompt: prompt3, response: response3, ttl: ttlSeconds });
  console.log(`Stored entry with entryId: ${result3.entryId}. It will expire soon.`);
  await new Promise(resolve => setTimeout(resolve, 2000));
  console.log("Storage examples complete.");
}

async function searchExamples() {
  console.log("\n--- Searching Examples (Node.js) ---");
  await langCache.set({ prompt: "What is the capital of Germany?", response: "Berlin" });
  await langCache.set({ prompt: "How does machine learning work?", response: "Machine learning involves training algorithms on data to identify patterns and make predictions." });
  await new Promise(resolve => setTimeout(resolve, 2000));
  const searchPrompt1 = "Capital of Germany?";
  console.log(`Searching for: '${searchPrompt1}'`);
  const results1 = await langCache.search({ prompt: searchPrompt1 });
  if (results1 && results1.results.length > 0) {
    const firstResult = results1.results[0];
    console.log(`Cache Hit! Response: '${firstResult.response.substring(0, 50)}...', Score: ${firstResult.score.toFixed(4)}`);
  } else {
    console.log("Cache Miss.");
  }
  const searchPrompt2 = "Tell me about learning with machines.";
  console.log(`\nSearching for: '${searchPrompt2}' with threshold 0.7`);
  const results2 = await langCache.search({ prompt: searchPrompt2, similarityThreshold: 0.7 });
  if (results2 && results2.results.length > 0) {
    results2.results.forEach((res, i) => {
      console.log(`Result ${i+1}: Response: '${res.response.substring(0, 50)}...', Score: ${res.score.toFixed(4)}`);
    });
  } else {
    console.log("Cache Miss.");
  }
  const searchPrompt3 = "What are some coding best practices?";
  await langCache.set({ prompt: "Write clean, readable code with meaningful variable names.", response: "Clean code is crucial." });
  await langCache.set({ prompt: "Use version control like Git for collaboration and tracking changes.", response: "Git is essential." });
  await langCache.set({ prompt: "Implement robust error handling and logging in your applications.", response: "Error handling is key." });
  await new Promise(resolve => setTimeout(resolve, 2000));
  console.log(`\nSearching for: '${searchPrompt3}' (top 2 results)`);
  const results3 = await langCache.search({ prompt: searchPrompt3, numResults: 2 });
  if (results3 && results3.results.length > 0) {
    results3.results.forEach((res, i) => {
      console.log(`Result ${i+1}: Response: '${res.response.substring(0, 50)}...', Score: ${res.score.toFixed(4)}`);
    });
  } else {
    console.log("Cache Miss.");
  }
  console.log("\nSearch examples complete.");
}

async function deleteExamples() {
  console.log("\n--- Deletion Examples (Node.js) ---");
  const promptToDeleteId = "Temporary data point for ID deletion.";
  const responseToDeleteId = "This will be deleted by ID.";
  const storeResultDelete = await langCache.set({ prompt: promptToDeleteId, response: responseToDeleteId });
  const entryIdToDelete = storeResultDelete.entryId;
  console.log(`Stored entry for deletion by ID: ${entryIdToDelete}`);
  await new Promise(resolve => setTimeout(resolve, 1000));
  console.log(`Deleting entry with ID: ${entryIdToDelete}`);
  await langCache.deleteById({ entryId: entryIdToDelete });
  console.log("Deletion by ID complete. Verifying search...");
  const resultsAfterDelete = await langCache.search({ prompt: "Temporary data point for ID deletion." });
  if (!resultsAfterDelete || resultsAfterDelete.results.length === 0) {
    console.log("Entry successfully deleted (search yielded no results).");
  } else {
    console.log("Entry not deleted or still found in search.");
  }
  await langCache.set({ prompt: "Product X feature overview", response: "Feature A, B, C.", attributes: { product: "X" } });
  await langCache.set({ prompt: "Product X pricing details", response: "Starts at $99/month.", attributes: { product: "X" } });
  await langCache.set({ prompt: "Product Y introduction", response: "New product with great features.", attributes: { product: "Y" } });
  console.log("\nStored entries for Product X and Product Y.");
  await new Promise(resolve => setTimeout(resolve, 2000));
  console.log("Deleting all entries with attribute 'product': 'X'");
  await langCache.deleteQuery({ attributes: { product: "X" } });
  console.log("Deletion by query complete. Verifying search...");
  const resultsProdX = await langCache.search({ prompt: "Product X details", attributes: { product: "X" } });
  if (!resultsProdX || resultsProdX.results.length === 0) {
    console.log("Entries for Product X successfully deleted.");
  } else {
    console.log(`Entries for Product X still found: ${resultsProdX.results.length}`);
  }
  console.log("\nDeletion examples complete.");
}

async function clearCacheExample() {
  console.log("\n--- Clear Cache Example (Node.js) ---");
  await langCache.set({ prompt: "Clear test 1", response: "Data to be cleared." });
  await langCache.set({ prompt: "Clear test 2", response: "More data to be cleared." });
  console.log("Stored some entries for clearing.");
  await new Promise(resolve => setTimeout(resolve, 1000));
  console.log("Clearing all entries from LangCache...");
  await langCache.clear();
  console.log("Cache cleared. Verifying search...");
  const resultsAfterClear = await langCache.search({ prompt: "Clear test 1" });
  if (!resultsAfterClear || resultsAfterClear.results.length === 0) {
    console.log("Cache successfully cleared (no results found).");
  } else {
    console.log("Cache not cleared or still found results.");
  }
  console.log("\nClear cache example complete.");
}

async function main() {
  await storeExamples();
  await searchExamples();
  await deleteExamples();
  await clearCacheExample(); // Use with caution!
}

main().catch(console.error);
Exercise/Mini-Challenge: Build a FAQ Caching System
Objective: Create a simple FAQ system where questions and answers are stored in LangCache. Implement functionalities to add new FAQs, search for answers, and remove specific FAQs.
Instructions:
- Create a new file (e.g., faq_cache.py or faq_cache.js).
- Initialize the LangCache client as shown in the examples.
- Implement the following functions:
  - add_faq(question, answer, category): Stores a new question-answer pair along with a category attribute (e.g., "General", "Billing", "Technical"). Print the entryId.
  - get_answer(query, category=None): Searches LangCache for an answer to the query. If a category is provided, it should narrow the search to that category using attributes. Print the answer and similarity score if found; otherwise indicate a cache miss.
  - remove_faq_by_id(entry_id): Deletes an FAQ entry using its entryId.
  - remove_faqs_by_category(category): Deletes all FAQs belonging to a specific category.
- Write a simple script that demonstrates these functions:
  - Add at least 5 different FAQs across 2-3 categories.
  - Perform searches for various questions, some with categories, some without, to observe how the cache responds.
  - Remove one FAQ by ID.
  - Remove all FAQs from one category.
  - Attempt to search for the removed FAQs to confirm deletion.
This exercise will give you hands-on experience with managing cache entries and leveraging attributes for more targeted caching.
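If you want to sketch the control flow before wiring in real LangCache calls, one approach is to prototype against a trivial in-memory stand-in. The FakeCache class below is entirely hypothetical (exact-prompt matching, attribute filters, a dummy 1.0 score, no embeddings), so the four FAQ functions can be exercised offline; for the actual exercise, replace it with the real client calls from this chapter.

```python
import uuid

class FakeCache:
    """In-memory stand-in for LangCache: exact-match search plus attribute
    filters. Real LangCache does semantic (embedding-based) matching instead."""
    def __init__(self):
        self.entries = {}  # entry_id -> {prompt, response, attributes}

    def set(self, prompt, response, attributes=None):
        entry_id = str(uuid.uuid4())
        self.entries[entry_id] = {"prompt": prompt, "response": response,
                                  "attributes": attributes or {}}
        return entry_id

    def search(self, prompt, attributes=None):
        hits = []
        for entry in self.entries.values():
            if entry["prompt"] != prompt:
                continue  # crude stand-in for semantic similarity
            if attributes and any(entry["attributes"].get(k) != v
                                  for k, v in attributes.items()):
                continue
            hits.append({"response": entry["response"], "score": 1.0})
        return hits

    def delete_by_id(self, entry_id):
        self.entries.pop(entry_id, None)

    def delete_query(self, attributes):
        doomed = [eid for eid, e in self.entries.items()
                  if all(e["attributes"].get(k) == v for k, v in attributes.items())]
        for eid in doomed:
            del self.entries[eid]

cache = FakeCache()

def add_faq(question, answer, category):
    return cache.set(question, answer, attributes={"category": category})

def get_answer(query, category=None):
    attrs = {"category": category} if category else None
    results = cache.search(query, attributes=attrs)
    return results[0]["response"] if results else None

def remove_faq_by_id(entry_id):
    cache.delete_by_id(entry_id)

def remove_faqs_by_category(category):
    cache.delete_query({"category": category})

# Quick demonstration of the flow the exercise asks for:
fid = add_faq("How do I reset my password?", "Use the 'Forgot password' link.", "General")
add_faq("How do I update my card?", "Go to Billing > Payment methods.", "Billing")
print(get_answer("How do I reset my password?"))          # the stored answer
remove_faq_by_id(fid)
print(get_answer("How do I reset my password?"))          # None after deletion
remove_faqs_by_category("Billing")
print(get_answer("How do I update my card?", "Billing"))  # None after category purge
```

Once the control flow behaves as expected, swapping FakeCache for the real async client is mostly a matter of adding await and the keyword arguments used earlier in the chapter.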