Performance
Copy page
Learn how to optimize performance in Nika platform
Copy page
​
Performance
The Nika platform leverages an advanced data lakehouse architecture to deliver high performance across all of its services.

## Data Lakehouse Architecture
### Unified Data Platform

- **Single Architecture**: Notebooks, maps, storage, databases, and VMs are all powered by one platform
- **Data Lakehouse**: Combines data lake flexibility with data warehouse performance
- **Unified Access**: Seamless data access across all Nika services
### Scalable Infrastructure

- **100TB+ Data Hosting**: Standard capacity for any analysis workload
- **PB-Scale Enterprise**: Petabyte-scale support for large enterprise workloads
- **Auto-Scaling**: Dynamic resource allocation based on demand

## Performance Features
### High-Performance Computing

- **GPU Acceleration**: NVIDIA T4 and H100 GPU support for ML workloads
- **Multi-Core Processing**: Up to 30 CPU cores for parallel processing
- **Memory Optimization**: Up to 156 GB RAM for large dataset handling (a quick way to check the resources your machine exposes is sketched below)
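The exact CPU, RAM, and GPU configuration depends on the machine you start. As a rough sanity check before launching a heavy job, a sketch like the following reports what the running machine actually exposes; it assumes `psutil` is installed and treats PyTorch as optional, since this page does not state which packages are preinstalled.

```python
import os

import psutil

# Logical CPU cores visible to this Python process.
print(f"CPU cores: {os.cpu_count()}")

# Total and currently available RAM, in GB.
mem = psutil.virtual_memory()
print(f"RAM: {mem.total / 1e9:.1f} GB total, {mem.available / 1e9:.1f} GB available")

# GPU check via PyTorch, if it happens to be installed (e.g. on a T4 or H100 machine).
try:
    import torch

    if torch.cuda.is_available():
        print(f"GPU: {torch.cuda.get_device_name(0)}")
    else:
        print("GPU: no CUDA device visible")
except ImportError:
    print("PyTorch not installed; skipping GPU check")
```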
### Data Processing

- **Streaming Execution**: Real-time data processing and analysis
- **Background Processing**: Long-running tasks continue even when the workspace is closed
- **Optimized Storage**: Efficient data formats and compression
### Network Performance

- **High Bandwidth**: Fast data transfer and access
- **Global CDN**: Content delivery network for worldwide access
- **Low Latency**: Minimal delay for interactive operations

## Optimization Best Practices
### Data Management

- **Efficient Formats**: Use optimized file formats such as Parquet and Cloud-Optimized GeoTIFF (COG); see the sketch below
- **Partitioning**: Implement data partitioning for faster queries
- **Caching**: Leverage built-in caching for repeated operations
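As one concrete illustration of the first two points, the sketch below writes a tabular dataset to Parquet partitioned by a `year` column using pandas and PyArrow. The file paths and column names are hypothetical placeholders, not part of the Nika documentation.

```python
import pandas as pd

# Hypothetical dataset; replace the path and column names with your own.
df = pd.read_csv("observations.csv", parse_dates=["timestamp"])
df["year"] = df["timestamp"].dt.year

# Columnar, compressed storage: Parquet partitioned by year, so queries
# that filter on year only touch the relevant partition directories.
df.to_parquet("observations_parquet/", engine="pyarrow", partition_cols=["year"])

# Reading a single partition back is far cheaper than scanning everything.
df_2024 = pd.read_parquet("observations_parquet/", filters=[("year", "==", 2024)])
```

For raster data, GDAL's COG driver fills a similar role: `gdal_translate -of COG input.tif output_cog.tif` produces a cloud-optimized GeoTIFF whose internal tiling and overviews allow efficient partial reads.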
### Resource Utilization

- **Right-Sized VMs**: Choose a machine configuration appropriate for the workload
- **Batch Processing**: Process data in manageable chunks
- **Memory Management**: Clean up large variables when you are done with them (both chunking and cleanup are sketched below)
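A minimal sketch of chunked processing and explicit cleanup with pandas follows; the file name, chunk size, and column names are placeholders.

```python
import gc

import pandas as pd

partial_sums = []

# Process the file in manageable chunks instead of loading it all at once.
for chunk in pd.read_csv("large_dataset.csv", chunksize=500_000):
    partial_sums.append(chunk.groupby("region")["value"].sum())

# Combine the per-chunk results into a single summary.
result = pd.concat(partial_sums).groupby(level=0).sum()

# Release large intermediates once they are no longer needed.
del partial_sums
gc.collect()
```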
### Code Optimization

- **Vectorized Operations**: Use vectorized operations instead of Python loops (as shown below)
- **Parallel Processing**: Utilize multiple cores for computation
- **Efficient Libraries**: Use optimized libraries such as NumPy, pandas, and GDAL
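The sketch below, assuming only NumPy and the standard library, shows both ideas: a vectorized expression replaces an element-by-element loop, and `multiprocessing` spreads an independent, CPU-bound calculation across cores.

```python
import numpy as np
from multiprocessing import Pool

values = np.random.default_rng(0).random(10_000_000)

# Vectorized: a single NumPy expression instead of a Python-level loop.
scaled = np.sqrt(values) * 2.5

def chunk_p95(chunk: np.ndarray) -> float:
    # Independent, CPU-bound work that can run in a separate process.
    return float(np.percentile(chunk, 95))

if __name__ == "__main__":
    # Parallel: split the array and map the function across worker processes.
    chunks = np.array_split(values, 8)
    with Pool() as pool:
        percentiles = pool.map(chunk_p95, chunks)
    print(percentiles)
```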
## Performance Monitoring

### Real-Time Metrics

- **Resource Usage**: Monitor CPU, memory, and GPU utilization
- **Execution Time**: Track code execution performance
- **Data Throughput**: Measure data processing speeds (a simple way to capture all three metrics from a notebook is sketched below)
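A small sketch of measuring these metrics from inside a notebook, assuming `psutil` is available; the workload is a placeholder to be replaced by your own processing step.

```python
import time

import numpy as np
import psutil

start = time.perf_counter()

# Placeholder workload; replace with your actual processing step.
data = np.random.default_rng(0).random(5_000_000)
result = np.sort(data)

elapsed = time.perf_counter() - start

# Execution time, data throughput, and current resource usage.
print(f"Elapsed: {elapsed:.2f} s")
print(f"Throughput: {data.nbytes / elapsed / 1e6:.1f} MB/s")
print(f"CPU utilization: {psutil.cpu_percent(interval=0.5):.0f}%")
print(f"Memory in use: {psutil.virtual_memory().percent:.0f}%")
```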
### Optimization Tools

- **Built-in Profiling**: Performance analysis tools for locating hot spots
- **Resource Monitoring**: Real-time resource tracking
- **Performance Alerts**: Automatic notifications when performance degrades
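This page does not describe Nika's built-in profiling tools in detail, but Python's standard `cProfile` module works in any notebook and is a reasonable starting point for finding hot spots; the profiled function here is a placeholder.

```python
import cProfile
import pstats

def workload():
    # Placeholder: substitute the function you actually want to profile.
    total = 0
    for i in range(1_000_000):
        total += i * i
    return total

profiler = cProfile.Profile()
profiler.enable()
workload()
profiler.disable()

# Show the ten most expensive calls, sorted by cumulative time.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)
```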
Last updated: August 2025