At HiveGPT LLC, we have put technical and organizational measures in place that demonstrate our commitment to protecting the confidentiality, integrity, and availability of our information systems and our customers' and users' data. We continually enhance our security measures and evaluate their efficacy to give you confidence in the HiveGPT solution. This page provides an overview of the security measures we have put in place to protect your data. If you have any questions, you can reach our security team at privacy@hivegpt.ai.
Cloud Security
Data Center Physical Security
Facilities
HiveGPT uses Amazon AWS and Microsoft Azure for data center hosting. AWS and Azure data centers are certified as ISO 27001, PCI DSS Service Provider Level 1, and/or SOC 1 and SOC 2 compliant. Learn more about Compliance at AWS and Learn more about Compliance at Azure. AWS and Azure employ robust controls to ensure the availability and security of their systems, including backup power, fire detection and suppression equipment, and secure device destruction. Learn more about Data Center Controls at AWS and Learn more about Data Center Controls at Azure.
On-Site Security: AWS and Azure implement layered physical security controls to ensure on-site security, including vetted security guards, fencing, video monitoring, and intrusion detection technology. Learn more about AWS Physical Security and Learn more about Azure Physical Security.
Network Security
HiveGPT has implemented industry-standard security controls to protect customer data from loss or unauthorized disclosure, and has created network boundary protection mechanisms for our production systems. Access to production systems containing customer information requires two-factor authentication through a VPN connection, and only authorized persons can access customer data. Developers and system administrators are not authorized to access customer data, and all customer data is protected with strong encryption. HiveGPT's service-level agreement (SLA) specifies the data maintenance services provided and excluded, conditions of service availability, standards (such as the time window for each level of service), the responsibilities of each party, escalation procedures, and cost/service trade-offs.
In-House Security Team: HiveGPT has a dedicated and passionate global security team that responds to security alerts and events.
Third-Party Penetration Tests: Third-party penetration tests are conducted against the application and supporting infrastructure at least annually and as needed (in case of changes in the application and/or infrastructure). Any vulnerabilities found during penetration tests are tracked and remediated. These penetration test reports are available by request with a Non-Disclosure and Confidentiality Agreement (NDA) in place.
Threat Detection: HiveGPT leverages threat detection services within AWS and Azure to continuously monitor for malicious and unauthorized activity.
Vulnerability Scanning: HiveGPT performs regular internal vulnerability scans of infrastructure, and the identified issues are tracked and remediated.
DDoS Mitigation: HiveGPT layers multiple strategies and tools to mitigate DDoS threats. We use AWS Shield to protect against DDoS attacks, and have also implemented Akamai's Web Application Protector, which provides an automated web application firewall (WAF) and distributed denial-of-service (DDoS) protection. It simplifies securing our web applications with automated updates to security protections, dropping network-layer attacks at the edge and responding to application-layer attacks within seconds to minimize downtime.
Encryption
All data is automatically encrypted when stored in the cloud. Azure Key Vault and AWS KMS serve as our key management solutions and are used to store tokens, passwords, certificates, API keys, and other secrets.
In-Transit: Communication with HiveGPT is encrypted via TLS 1.2 or higher over public networks. Data in transit via email is encrypted via S/MIME and TLS. We monitor community testing and research in this field and continue to follow best practices for cipher adoption and TLS configuration.
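As a rough illustration of the TLS 1.2 floor described above (a hedged sketch, not HiveGPT's actual production configuration), a client built on Python's standard ssl module can refuse older protocol versions like this:

```python
import ssl

# Create a client-side context with certificate and hostname
# verification enabled by default.
ctx = ssl.create_default_context()

# Enforce a TLS 1.2 minimum: handshakes offering TLS 1.0/1.1 are rejected.
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# Default protections remain on.
assert ctx.check_hostname
assert ctx.verify_mode == ssl.CERT_REQUIRED
```

Any socket wrapped with this context will fail the handshake against a server that only supports pre-1.2 TLS, which is the behavior the paragraph above describes.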
At Rest: HiveGPT data at rest is encrypted with industry-standard AES-256 symmetric encryption and decrypted transparently as it is loaded into memory for use. Data at rest can be protected using customer-managed keys or AWS-managed keys. Keys are stored in a secure location with identity-based access control and audit policies.
Application Security
Quality Assurance: HiveGPT’s Quality Assurance (QA) team reviews and tests the code base on a per-pod basis. The security team has all the required tools and resources to investigate and recommend remediations and solutions for security vulnerabilities within code. Regular syncs, security resources, and training are provided to the QA team.
Environment Segregation: The testing, staging, and production environments are logically separated from one another. No customer data is used in any development or test environment.
Logs
HiveGPT preserves detailed log data, including audit logs, transaction logs, event logs, error logs, message logs, backup logs, network flow logs, and infrastructure change logs. Infrastructure and system-level logs for production environments are retained for one year; testing and staging environments have shorter retention periods. These logs are kept only for the purpose of, and as long as is necessary for, the Permitted Purpose, and are permanently deleted within 24 hours of the termination and/or expiration of the Agreement.
Personnel Security
HiveGPT only hires skilled professionals who successfully complete a background screening and sign a confidentiality agreement, acceptable use of information systems agreement, and code of conduct. Access control changes following personnel transfers are based on a Least Privilege model: Employees are given the minimum amount of access/permissions needed in order for them to perform their jobs.
Security Awareness
HiveGPT conducts a robust Security Awareness Training program, which is delivered within 30 days of hire and annually for all employees. In addition, we conduct quarterly focused training for key departments, including Secure Coding, Data Legislation, and Compliance.
Information Security Program
HiveGPT LLC maintains an information security program focused on the security and integrity of Customer Data. Our comprehensive set of information security policies covers a range of topics and is distributed to both employees and vendors, accompanied by key policies such as Acceptable Use, Information Security, and the Employee Handbook. Our Information Security Program includes the adoption and enforcement of internal policies and procedures designed to protect the security and integrity of Customer Data.
HiveGPT’s information security program includes administrative, technical, and operational controls appropriate for the size of its business and the types of information it processes. We measure training effectiveness through quizzes and analyzing the trend of security incidents post-training, and re-train employees as needed.
Employee Background Checks
All HiveGPT employees undergo a background check prior to employment, which includes identity verification, qualification verification, criminal history, and employment verification.
Confidentiality Agreements
All employees are required to sign Non-Disclosure and Confidentiality Agreements (NDAs).
Access Controls
Access is limited using a Least Privilege model: employees are given the minimum access and permissions needed to perform their jobs, and all permissions require a documented business need. Access controls are frequently audited and are subject to technical enforcement and monitoring to ensure compliance. Two-factor authentication is required for all production systems, including logical access to server platforms and management systems. Access to systems and network resources is granted through a documented, approved request process. Periodic checks ascertain whether the owner of a user ID is still employed and assigned to the appropriate role, and validation is conducted quarterly to determine whether access levels remain commensurate with each user's job function; any exceptions found during validation are remediated. User access is revoked upon termination of employment or change of job role. To authenticate to internal systems on AWS infrastructure, HiveGPT uses Identity and Access Management (IAM) to define and manage roles and access privileges, with Active Directory providing centralized authentication and authorization for internal system access. To authenticate to customer-facing systems, we use Customer Identity and Access Management (CIAM), while employee access is handled through Employee Identity Management.
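To illustrate the Least Privilege model on AWS IAM described above, a policy can grant only the specific actions a role needs on specific resources. This is a generic sketch, not one of HiveGPT's actual policies; the bucket name is hypothetical:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadOnlyAppLogs",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-app-logs",
        "arn:aws:s3:::example-app-logs/*"
      ]
    }
  ]
}
```

Because IAM denies everything not explicitly allowed, a role attached to this policy can read the named bucket but cannot write to it, delete from it, or touch any other resource, which is exactly the "minimum access needed" posture.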
Third-Party Security
Vendor Management
HiveGPT understands the risks associated with improper vendor management. We evaluate and perform due diligence on all our vendors prior to engagement to ensure their security meets our standards. If they do not meet our requirements, we do not move forward with them. Selected vendors are monitored and reassessed on an ongoing basis, taking into account relevant changes.
Third-Party Sub Processors
HiveGPT uses third-party sub-processors to provide core infrastructure and services that support the application. Prior to engaging any third party, HiveGPT evaluates the vendor’s security as per our Vendor Management Policy.
Vendor (Sub-Processor): Amazon Web Services, Inc.
Relevant Services: Infrastructure Provider
Corporate Location: US (West 2); EU (Frankfurt)
International Transfer Mechanism: Standard Contractual Clauses
Vendor DPA: https://d1.awsstatic.com/legal/aws-gdpr/AWS_GDPR_DPA.pdf
Vendor (Sub-Processor): Microsoft Azure
Relevant Services: Infrastructure Provider
Corporate Location: US (West 2); EU (West)
International Transfer Mechanism: Standard Contractual Clauses
Vendor DPA: https://www.microsoftvolumelicensing.com/DocumentSearch.aspx?Mode=2&Keyword=DPA
Vendor (Sub-Processor): Akamai Technologies
Relevant Services: CDN, DoS protection and WAF
Corporate Location: US
International Transfer Mechanism: Standard Contractual Clauses
Vendor DPA: https://www.akamai.com/fr/fr/multimedia/documents/akamai/akamai-data-protection-addendum.pdf
Vendor (Sub-Processor): Datadog
Relevant Services: Log Aggregation & Analysis
Corporate Location: US
International Transfer Mechanism: Standard Contractual Clauses
Vendor DPA: https://www.datadoghq.com/legal/msa/
Vendor (Sub-Processor): Tray.io, Inc.
Relevant Services: Connect and automate the APIs
Corporate Location: US
International Transfer Mechanism: Standard Contractual Clauses
Vendor DPA: Available on Request
Vendor (Sub-Processor): Stripe
Relevant Services: Payment Processing
Corporate Location: US and EU
International Transfer Mechanism: Standard Contractual Clauses
Vendor DPA: Available on Request
Vendor (Sub-Processor): Microsoft SQL Server
Relevant Services: Data Storage
Corporate Location: US
International Transfer Mechanism: Standard Contractual Clauses
Vendor DPA: https://www.microsoftvolumelicensing.com/DocumentSearch.aspx?Mode=2&Keyword=DPA
Vendor (Sub-Processor): Twilio, Inc.
Relevant Services: Provides features and services to allow communication with End Users via SMS, chat, and voice.
Corporate Location: US
International Transfer Mechanism: Standard Contractual Clauses
Vendor DPA: https://www.twilio.com/legal/data-protection-addendum
Vendor (Sub-Processor): Pendo.io, Inc.
Relevant Services: Provides an integrated platform for digital product teams in order to combine powerful product usage analytics with user guidance, communication, feedback, and planning tools.
Corporate Location: US
International Transfer Mechanism: Standard Contractual Clauses
Vendor DPA: Available on Request
Vendor (Sub-Processor): Zendesk, Inc.
Relevant Services: Provides customer relations management software and a customer support platform to enable HiveGPT to support and manage HiveGPT’s relationship with the Controller.
Corporate Location: US
International Transfer Mechanism: Standard Contractual Clauses
Vendor DPA: https://www.zendesk.com/company/privacy-and-data-protection/
Vendor (Sub-Processor): SalesForce.com, Inc.
Relevant Services: Provides customer relations management software and a customer support platform to enable HiveGPT to support and manage HiveGPT’s relationship with the Controller.
Corporate Location: US
International Transfer Mechanism: Standard Contractual Clauses
Vendor DPA: https://www.salesforce.com/content/dam/web/en_us/www/documents/legal/Agreements/data-processing-addendum.pdf
Vendor (Sub-Processor): Google LLC
Relevant Services: Internal Collaboration & Storage / Firebase / Analytics
Corporate Location: US
International Transfer Mechanism: Standard Contractual Clauses
Vendor DPA: https://workspace.google.com/terms/dpa_terms.html
Vendor (Sub-Processor): Mailgun Technologies, Inc.
Relevant Services: Email and office applications
Corporate Location: US
International Transfer Mechanism: Standard Contractual Clauses
Vendor DPA: Available on Request
Vendor (Sub-Processor): SendGrid, Inc.
Relevant Services: Cloud-based email services provider to communicate review request emails to End Users.
Corporate Location: US
International Transfer Mechanism: Standard Contractual Clauses
Vendor DPA: https://s3.eu-west-2.amazonaws.com/primalbase/privacy/SEND+GRID+data+contract.pdf
Vendor (Sub-Processor): Microsoft Teams / Office 365
Relevant Services: Microsoft Teams / Office 365 is used for creating roundtables, and SSO
Corporate Location: US
International Transfer Mechanism: Standard Contractual Clauses
Vendor DPA: https://www.microsoftvolumelicensing.com/DocumentSearch.aspx?Mode=3&DocumentTypeId=67
Vendor (Sub-Processor): Zoom Video Communications, Inc.
Relevant Services: Provides cloud platform for video, voice, content sharing, and chat runs.
Corporate Location: US
International Transfer Mechanism: Standard Contractual Clauses
Vendor DPA: https://zoom.us/docs/doc/Zoom_Data_Processing_Addendum_Processor_Form_Final-SIGNED.pdf
Vendor (Sub-Processor): Atlassian Pty Ltd (Jira)
Relevant Services: Internal support collaboration
Corporate Location: US
International Transfer Mechanism: Standard Contractual Clauses
Responsible Disclosure
If you find a vulnerability in HiveGPT LLC's information systems, you may report it to privacy@hivegpt.ai. The report must contain enough information for the vulnerability to be reproduced.
Performance and Availability
Performance and Scaling
HiveGPT is hosted on public cloud infrastructure. Services are deployed across multiple availability zones and regions and are designed to scale dynamically in response to measured and expected loads; CloudFront is used to distribute services globally. Simulated load tests and API response-time tests are integrated into HiveGPT's release and testing cycles. HiveGPT maintains a publicly available status webpage that details system availability by product area, scheduled maintenance windows, service incident history, and security incident details.
HiveGPT employs an Auto Scaling group that automatically terminates a faulty instance in one hosting location and brings up a new healthy instance in another. Because services are deployed across availability zones, business can continue as usual using the remaining healthy services. HiveGPT has run events with 25,000 concurrent users and can scale further using autoscaling; the platform currently handles peak loads of anywhere from 10,000 to 100,000 users and can be scaled up or down in response to demand within 5 to 10 minutes. All customer environments are separated from one another, so one customer cannot cause an outage for another.
HiveGPT runs load tests directly on the web application and maintains performance tests to preserve the availability of services. We also run network load tests, system load tests, and vulnerability testing on our infrastructure, application, and code before pushing changes to production. CloudWatch alarms and operations management alert HiveGPT to potential performance problems. Our current mean time to detection of customer-impacting issues is 5 to 10 minutes, and our mean time to resolution is 15 minutes for system- or application-level issues (per our SLA); more extensive issues or fixes may take anywhere from one day to one week.
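The multi-zone, self-healing pattern described above can be sketched as a CloudFormation fragment. This is a hypothetical illustration, not HiveGPT's actual template; the resource names, size limits, and referenced launch template are assumptions:

```yaml
Resources:
  AppAutoScalingGroup:
    Type: AWS::AutoScaling::AutoScalingGroup
    Properties:
      MinSize: "2"                  # illustrative floor, not a real figure
      MaxSize: "50"                 # illustrative ceiling, not a real figure
      HealthCheckType: ELB          # replace instances the load balancer marks unhealthy
      HealthCheckGracePeriod: 300
      AvailabilityZones: !GetAZs "" # spread instances across the region's zones
      LaunchTemplate:
        LaunchTemplateId: !Ref AppLaunchTemplate   # hypothetical template defined elsewhere
        Version: !GetAtt AppLaunchTemplate.LatestVersionNumber
```

With this shape, a failed health check causes the group to terminate the faulty instance and launch a replacement in a healthy zone, while the min/max bounds let the fleet grow and shrink with demand.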
Disaster Recovery
In the event of a significant regional outage, HiveGPT can deploy the application to a new hosting region. Our Disaster Recovery Plan ensures the availability of services and ease of recovery in the case of a disaster; the plan is regularly tested and evaluated for improvement and automation. Disaster recovery deployments are handled by the same design and release management processes as our production environment, ensuring that all security settings and controls are correctly implemented. Customer data is replicated to the Azure SQL database using the Data Export Service. HiveGPT's backup arrangements meet the requirements of our business continuity plans: critical systems are identified, and backup arrangements are made to recover all critical system information, applications, and data in the event of a disaster. Automated backup solutions are thoroughly tested prior to implementation and at regular intervals thereafter. Where confidentiality is important, backups are protected by encryption, and all encryption keys are kept secure at all times, with clear procedures in place to ensure that backup media can be promptly decrypted as required. Data recovery operations are performed only by competent, authorized staff. Recovery of data from backups takes no more than 24 to 72 hours, depending on the amount of data.