Data Controls
You can access, modify, or delete your data.
Updated 2025-12-31
Written by Lubinpla
This policy sets forth the principles and minimum standards to protect the confidentiality, integrity, and availability of data processed during the operation of the industrial chemistry-specialized AI platform (including projects, documents, assistants, workflows, and Den) provided by Lubinpla (“the Company”). This policy applies to employees and partner personnel (including contractors and freelancers), systems, accounts, networks, storage, and all tasks where customer data is processed.
The data analysis and service improvement provisions in this policy serve the Company's security operations, incident response, functionality improvements, abuse prevention, and quality enhancement, and set criteria that limit the use of customer data to those intended purposes.
1. Definitions of Terms and Data Classification
1.1 “Customer Content” refers to the content uploaded, input, or generated by the customer or user on the platform, including documents/attachments stored in Den, the body and attachments of Notes, the body of Assistant conversations, project descriptions/memos, and other free-text entries.
1.2 “Structured Project Data” refers to data recorded and managed in the form of forms/fields within a project, such as project creation information, product lines/categories, condition items (name/value/unit/description), change history, and product usage records (product name/quantity/date/status), etc.
1.3 “Usage & Telemetry Data” refers to technical and operational data generated for the provision and protection of the service, including feature usage events, performance/stability metrics, error codes, session/device-level metadata, access logs, abuse detection signals, etc.
1.4 “Content-derived Features” refers to features that derive information by interpreting the meaning of text or documents, such as summarizing, classifying, or tagging customer content, or evaluating the structural completeness of troubleshooting, as well as the derived information these features generate (e.g., classification of note types, detection of missing troubleshooting steps, quality scores).
2. Basic Principles for Language Models and Third-party Processing
2.1 The Company’s fundamental principle is not to use input and output data processed through APIs or service screens (including Project/Note/Den and equivalent customer content) to train foundation models (general-purpose base models) or other global models operated by the Company or third parties. However, if the customer or organizational administrator explicitly opts in (provides prior consent), minimal data may be used within the scope of that consent for model quality improvements or safety/performance evaluations, and the scope, purpose, processing methods, retention period, withdrawal (opt-out) process, and scope of withdrawal will be notified separately.
2.2 Even when using external model providers (such as OpenAI, Anthropic, or Google), the Company will prioritize channels and settings that uphold the same basic principle: input and output data is not used for model training, with exceptions only on explicit opt-in.
2.3 The Company may review the application of alternative provision paths to strengthen data separation levels, such as region/data zone selection, tenant isolation, storage/logging options, and network control, in accordance with customer security requirements or contractual conditions. However, these measures are not basic features provided to all customers and may vary depending on technical/operational availability and separate agreements (or security addenda).
2.4 The Company acknowledges that the data processing terms and conditions of external model providers may change and will continuously monitor and manage them through contracts, documents, and technical settings (e.g., storage options, minimized logging, access control, network boundaries). If any changes are identified that could substantially affect the processing of customer data, the Company will notify customers within a reasonable range and may consider applying alternative paths or adjusting conditions if necessary.
3. Encryption and Security Controls
3.1 The Company applies appropriate encryption and security controls to data both in transit and at rest. Industry-standard security protocols (e.g., TLS-based) protect data in transit so that it is not exposed to interception or tampering along the way.
3.2 The Company applies encryption at rest when data is stored in databases and related storage to reduce the risk of unauthorized access and leakage at the storage media level. Encryption at rest is a supplementary measure that protects data while stored, and it is operated in combination with access controls.
3.3 Encryption keys are managed through a secure key management system, and access rights are limited to the necessary scope for business purposes. In addition, administrative and technical protective measures such as access control, rights management, and monitoring are in place to prevent and respond to security incidents.
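The transport-layer controls in 3.1 can be illustrated with a minimal sketch. The helper below is hypothetical (this policy does not specify the platform's actual TLS configuration); it builds a Python client context that refuses legacy protocol versions and requires certificate and hostname verification.

```python
import ssl

def make_client_context() -> ssl.SSLContext:
    """Hypothetical TLS client configuration illustrating Section 3.1.

    Enforces TLS 1.2+ and certificate/hostname verification so data
    in transit is not exposed to downgrade or impersonation attacks.
    """
    ctx = ssl.create_default_context()            # secure defaults, CA verification
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse SSLv3 / TLS 1.0 / TLS 1.1
    ctx.check_hostname = True                     # bind the certificate to the host
    ctx.verify_mode = ssl.CERT_REQUIRED
    return ctx

ctx = make_client_context()
print(ctx.minimum_version >= ssl.TLSVersion.TLSv1_2)  # → True
```

In practice such a context would be passed to the HTTP client or socket layer; the point of the sketch is that legacy protocol versions are rejected by configuration, not by convention.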
4. Data Analysis and Service Improvement Policy
4.1 The Company may collect, process, and analyze data within the minimum scope necessary for the stable operation and improvement of its products and services. In doing so, (i) customer organization boundaries are respected, (ii) analysis is generally conducted in an aggregated or anonymized form, and (iii) data is not disclosed or provided to third parties in a form that could re-identify customers unless explicitly authorized.
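The aggregation principle in 4.1 can be sketched as follows. This is an illustrative example only, not the Company's actual pipeline; the event shape and the suppression threshold `k` are assumptions. Tenant and user identifiers are dropped before counting, and buckets smaller than `k` are suppressed so that no single tenant can be singled out.

```python
from collections import Counter

def aggregate_events(events, k=5):
    """Aggregate feature-usage events into counts, suppressing any
    bucket smaller than k (hypothetical sketch of Section 4.1)."""
    counts = Counter(e["feature"] for e in events)  # tenant/user IDs are discarded
    return {feature: n for feature, n in counts.items() if n >= k}

events = ([{"tenant": "t1", "feature": "search"}] * 7
          + [{"tenant": "t2", "feature": "export"}] * 2)
print(aggregate_events(events))  # → {'search': 7}; the small 'export' bucket is suppressed
```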
4.2 For service operation and product improvement, the Company may collect and analyze some of the structured project data in a minimal scope. This may include the following items:
Project
- Operational events and history: Changes in project creation, modification, and archiving status, change history of condition/product records
- Input quality indicators: Missing required values, unit/format errors, validation failures, outlier detection, etc.
- Usage distribution (aggregated): Frequency/distribution of categories/condition items, feature usage flow (aggregated statistics)
- Product usage records (aggregated): Existence of usage history, status changes, input error rates, usage flow, etc.
- Schema/structure information: Field names, types, required status, choice configurations, unit/format, input value ranges (validation rules), and change history
- Other equivalent information: Information reasonably necessary for service provision, operation, security, and quality improvement within the same scope as the above
Note
- Operational events: Note creation, modification, deletion, project linking (tagging) events
- Tag usage (aggregated): Frequency/distribution of hashtag use, input patterns (aggregated)
- Attachment metadata (excluding original content): Upload success/failure, file format/size ranges, etc.
- Search/explore quality: Search/filter usage patterns, search failures/delays, result click rates, etc.
- Stability/security signals: Error codes, abuse detection signals, etc.
Den
- Document/folder creation, modification, movement, deletion, restoration events (operational events)
- Upload/download/preview/view events and success/failure rates (e.g., upload failure codes, timeouts)
- File format/extension, file size (capacity range), page count, etc. (metadata excluding content)
- Version management/replacement uploads/link sharing (including internal sharing) events and usage patterns
- Permission/share setting change events (e.g., access grant/revocation, link expiration settings) and access denial (403, etc.) patterns
- Search/filter/folder navigation usage patterns (e.g., search term length/frequency, search result click rates) and search failure/delay patterns
- Malicious file detection signals (e.g., malware scan results), abnormal mass downloads, etc.
Workflow
- Execution/step logs: Execution time/count, step-by-step success/failure/retry/abort, error codes
- Performance/stability metrics: Step-by-step execution time/latency, queue congestion, timeouts, rate limits, etc.
- Trigger reliability: Condition/schedule/event trigger operation status, missing/duplicate execution patterns
- Schema/validation (excluding original content): Input/output schemas (field names/types/required status), mapping/validation failures (format errors)
- Permissions/approvals/notifications metrics: Approval requests/approvals/rejections, execution permission errors, notification delivery success/failure rates and delays (minimizing personally identifiable information)
- Security/abuse signals: Abnormal call volumes, repeated failure patterns, etc.
4.3 The collection and analysis results under this section will be used solely for internal purposes related to service provision and improvement and will generally be processed in an aggregated or anonymized form. The Company does not use customer data for purposes such as comparing, benchmarking, or ranking customers between tenants, and will not disclose or provide such data in a form that re-identifies a specific customer unless the customer has given prior consent or there is a legal obligation.
4.4 The Company may segment project content to generate and store embeddings (vector representations) or search indexes (keywords/semantic indexes, etc.) for functions such as internal search, recommendations, evidence-based responses, and quality checks within the customer organization. This process may include freely inputted customer text (e.g., field names/memos arbitrarily created by customers) and will only be processed within a limited scope.
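The segmentation and indexing described in 4.4 can be sketched as below. This is a minimal illustration only: it substitutes a plain keyword index for the semantic/vector embeddings the policy mentions, and the chunk size and sample text are invented for the example. Naive fixed-size chunking can split words across chunk boundaries, which is one reason real pipelines segment on sentence or token boundaries instead.

```python
import re

def chunk(text, size=200):
    """Split free customer text into fixed-size character chunks (Section 4.4).
    Illustrative only; real systems segment on sentence/token boundaries."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def build_index(chunks):
    """Map lowercase keywords to the chunk ids containing them,
    a stand-in for the semantic/vector index the policy describes."""
    index = {}
    for cid, c in enumerate(chunks):
        for word in set(re.findall(r"[a-z0-9]+", c.lower())):
            index.setdefault(word, []).append(cid)
    return index

chunks = chunk("Reactor temperature drifted above the validated range.", size=30)
idx = build_index(chunks)
print(idx["temperature"])  # → [0]
```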
4.5 The Company applies the principle of least privilege for access to analytical data (including derived indicators), and will record and manage access/processing activities to make them auditable. Role-based access control, access approval procedures, anomaly monitoring, and other administrative/technical protective measures are applied as needed, ensuring that customer data is not exposed to other customers through tenant isolation and access controls.
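The least-privilege and auditability requirements in 4.5 can be illustrated with a short sketch. The roles, permissions, and actions below are hypothetical (the platform's actual role model is not specified in this policy); the point is that an action is permitted only if the role explicitly grants it, and every attempt, allowed or denied, is recorded for audit.

```python
from datetime import datetime, timezone

# Illustrative role model; not the platform's actual permission set.
ROLE_PERMS = {
    "analyst": {"read_aggregates"},
    "admin": {"read_aggregates", "read_raw_logs"},
}
AUDIT_LOG = []  # every attempt is recorded, making access auditable

def access(user: str, role: str, action: str) -> bool:
    """Least-privilege check: permit only actions the role explicitly grants."""
    allowed = action in ROLE_PERMS.get(role, set())
    AUDIT_LOG.append({
        "ts": datetime.now(timezone.utc).isoformat(),
        "user": user, "role": role, "action": action, "allowed": allowed,
    })
    return allowed

print(access("analyst-1", "analyst", "read_raw_logs"))    # → False (denied)
print(access("analyst-1", "analyst", "read_aggregates"))  # → True
print(len(AUDIT_LOG))                                     # → 2 (both attempts logged)
```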
4.6 For quality assessment purposes, the Company may collect and analyze performance, stability, error rates, and functional success rates to improve the safety, accuracy, and quality of agents during service usage. Data collection and analysis for these assessments is limited by default to aggregated or anonymized operational/quality indicators, and will not include customer content in its original form or any information that could re-identify a specific customer's fields or processes.
4.7 Based on the customer's security, regulatory, and internal control requirements, the Company may define and mutually agree upon specific data processing and usage scopes (e.g., log levels, retention periods, content-based evaluation, third-party processing routes) through additional agreements (e.g., separate agreements, Data Processing Agreements (DPA), etc.) between enterprises. This additional agreement typically applies to Enterprise customers, and data will only be processed and utilized within the mutually agreed-upon scope.
5. Retention Period, Deletion, and Data Minimization
5.1 The Company sets retention purposes and periods for each type of data and strives to ensure that data is not retained beyond the minimum period necessary to fulfill those purposes. Service usage data and raw logs are retained within the limited scope necessary for security, incident response, and operational purposes, and are over time converted into aggregated or anonymized statistics or deleted.
5.2 Retention and deletion of customer content and structured project data will be handled according to the customer contract, functional necessity, and legal obligations, and specific retention criteria (e.g., return/deletion/retention grace period) may be defined upon customer request or contract termination.
6. Policy Changes and Continuous Monitoring
6.1 The Company may revise this policy in response to service changes, legal and external provider policy changes, or changes in security threats, and will notify customers in a reasonable manner in the case of significant changes.
6.2 The Company will continuously monitor compliance with data processing and operate monitoring systems to prevent and respond to security incidents.