Blenra
Optimized for: Gemini / ChatGPT / Claude
#Observability

Deep JVM Monitoring for Microservices with Micrometer

Customize the variables below to instantly engineer your prompt.

Required Variables: [JVM_VERSION], [GC_TYPE], [HEAP_LIMIT]

jvm-monitoring-micrometer-prometheus.txt
Act as an elite Java performance specialist. Architect a highly granular Grafana dashboard and a specialized Prometheus scrape configuration tailored for a large Spring Boot microservice running on [JVM_VERSION] with the Micrometer registry. The observability suite must focus on the specific behaviors of the [GC_TYPE] collector (e.g., G1GC or ZGC), active thread states, and per-pool memory consumption plotted against the hard [HEAP_LIMIT]. Provide the exact PromQL queries required to calculate 'Garbage Collection Overhead Percentage' (the share of CPU time spent in GC pauses) and 'Metaspace Utilization'. Detail exactly how to arrange these queries into a single, high-density Grafana dashboard designed to diagnose JVM memory leaks at a glance.
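As a reference point for the queries the prompt asks the model to produce, here is a minimal sketch assuming Micrometer's default Prometheus metric names (`jvm_gc_pause_seconds`, `jvm_memory_used_bytes`, `jvm_memory_max_bytes`, `jvm_threads_states_threads`); actual names and labels may differ if a custom naming convention or common tags are configured in your registry:

```promql
# GC Overhead Percentage: GC pause seconds accrued per second of
# wall-clock time, expressed as a percentage, per instance.
sum(rate(jvm_gc_pause_seconds_sum[5m])) by (instance) * 100

# Metaspace Utilization: used vs. max Metaspace as a percentage.
# Note: jvm_memory_max_bytes is -1 when no MaxMetaspaceSize is set.
jvm_memory_used_bytes{area="nonheap", id="Metaspace"}
  / jvm_memory_max_bytes{area="nonheap", id="Metaspace"} * 100

# Heap consumption per memory pool, to plot against the hard limit.
sum(jvm_memory_used_bytes{area="heap"}) by (id, instance)

# Thread counts by state (runnable, blocked, waiting, ...).
sum(jvm_threads_states_threads) by (state, instance)
```

A steadily rising used-heap floor after each major GC cycle in the per-pool panel is the classic memory-leak signature the single-dashboard layout is meant to surface.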

Example Text Output

"The response provides a specialized JVM dashboard schema and PromQL expressions for deep runtime analysis."


Frequently Asked Questions

What is the "Deep JVM Monitoring for Microservices with Micrometer" prompt used for?

The response provides a specialized JVM dashboard schema and PromQL expressions for deep runtime analysis.

Which AI tools work with this prompt?

This prompt is optimized for Gemini, ChatGPT, and Claude, but works with most other large language models as well. Simply copy it and paste it into your preferred AI tool.

How do I customize this prompt?

Use the variable fields above to fill in your specific details. The prompt will auto-update as you type, ready to copy instantly.

Is this prompt free?

Yes! All prompts on Blenra are free to copy and use immediately. No account required.