EU Digital Services Act (DSA) Compliance Transparency
Official platform documentation and governance guidance.
Enterprise EU Digital Services Act (DSA) Compliance Framework
1. Digital Services Act Legal Mandate
Nexly.biz (the “Company”) is committed to the highest standards of platform accountability as set out in the EU Digital Services Act (Regulation (EU) 2022/2065). Our mission is to provide a safe, transparent, and user-centric digital environment in which illegal content is systematically addressed and users’ fundamental rights are protected.
3. User Single Point of Contact (Art. 12)
Recipients of our services can contact Nexly directly regarding DSA-related inquiries or reports of illegal content. We maintain a non-automated triage system to ensure that user concerns are reviewed by human experts.
LegalIntegrity@nexly.biz
4. Legal Representative (Art. 13)
For the purposes of Article 13 of the DSA, Nexly.biz designates the following executives as its responsible representatives within the European Union:
- Managing Director: Dempsey De Clerck
- Operations Director: Delia Lazarescu
5. Content Moderation & Algorithmic Logic
Our moderation strategy uses a hybrid approach: high-fidelity AI filters flag potential safety violations, while human specialists make the final decision in complex pedagogical or contextual cases. We do not apply fully automated de-platforming; high-impact decisions always receive human oversight.
6. Notice & Action Mechanism (Art. 16)
Nexly provides a globally accessible "Illegal Content Portal." To ensure a valid notice, please include:
- Explanation: a sufficiently substantiated statement of why you consider the content illegal.
- Exact location: the URL or other unique identifier of the specific content.
- Good-faith declaration: a statement confirming the accuracy of the submission, together with your name and contact details.
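The notice elements above can be sketched as a simple server-side validation step. This is a minimal illustration with hypothetical field names, not Nexly’s actual portal API:

```python
from dataclasses import dataclass


@dataclass
class Notice:
    """Hypothetical Art. 16 notice payload (illustrative field names)."""
    explanation: str   # why the content is considered illegal
    location: str      # exact URL or identifier of the specific content
    good_faith: bool   # declaration that the submission is accurate
    contact: str       # submitter's contact details


def is_valid_notice(n: Notice) -> bool:
    """A notice is actionable only when every required element is present."""
    return (
        bool(n.explanation.strip())
        and bool(n.location.strip())
        and n.good_faith
        and bool(n.contact.strip())
    )
```

A notice missing any element would be returned to the submitter rather than entering the moderation queue.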
7. Internal Complaint-Handling System (Art. 20)
Service recipients have the right to appeal any content moderation decision (removal, disabling, or account suspension) within six months of being notified of that decision. Appeals are processed by an independent internal review board at no cost to the user.
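The six-month appeal window can be sketched as a simple admissibility check. This is an illustrative sketch that approximates “six months” as 183 days, which is an assumption, not a statement of how Nexly computes the deadline:

```python
from datetime import date, timedelta

# Assumption: "six months" approximated as a fixed number of days.
APPEAL_WINDOW_DAYS = 183


def appeal_is_timely(notified: date, filed: date) -> bool:
    """Art. 20 check: an appeal is admissible only if filed within the
    window starting on the date the user was notified of the decision."""
    return notified <= filed <= notified + timedelta(days=APPEAL_WINDOW_DAYS)
```
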
8. Out-of-Court Dispute Settlement (Art. 21)
Users dissatisfied with internal appeal outcomes have the right to select any certified out-of-court dispute settlement body to resolve disputes relating to content moderation. Nexly will cooperate with such bodies in accordance with DSA mandates.
9. Trusted Flaggers (Art. 22)
Reports from "Trusted Flaggers" (as designated by Digital Services Coordinators) are given priority in our triage queue. We commit to reviewing and deciding on these notices on an expedited basis.
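The prioritized triage described above can be sketched as a priority queue in which trusted-flagger notices are always reviewed before standard ones, with arrival order preserved within each tier. The tier names are illustrative assumptions:

```python
import heapq

# Assumed priority tiers: lower value is reviewed first.
PRIORITY = {"trusted_flagger": 0, "standard": 1}


class TriageQueue:
    """Minimal sketch of a triage queue honoring trusted-flagger priority."""

    def __init__(self) -> None:
        self._heap: list[tuple[int, int, str]] = []
        self._seq = 0  # tie-breaker: preserves arrival order within a tier

    def submit(self, notice_id: str, source: str) -> None:
        heapq.heappush(self._heap, (PRIORITY[source], self._seq, notice_id))
        self._seq += 1

    def next_for_review(self) -> str:
        return heapq.heappop(self._heap)[2]
```
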
10. Interface Design & Dark Pattern Defense (Art. 25)
Nexly’s UI/UX architecture is engineered to ensure user autonomy. We strictly prohibit "Dark Patterns" that deceive or manipulate users into making unintended choices regarding their data, subscriptions, or content interactions.
11. Advertising Transparency (Art. 26)
For every advertisement presented on our platform, Nexly ensures immediate transparency by disclosing:
- That the content is an advertisement.
- The identity of the advertiser.
- Meaningful information about the main parameters used to target the advertisement.
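The three disclosures above can be sketched as a record attached to each ad and rendered into a user-facing label. The field names and label format are hypothetical, not Nexly’s actual ad schema:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class AdDisclosure:
    """Hypothetical Art. 26 disclosure attached to each advertisement."""
    is_advertisement: bool   # the content is labelled as an ad
    advertiser: str          # identity of the advertiser
    targeting_summary: str   # main parameters used to target the ad


def render_label(d: AdDisclosure) -> str:
    """Build the user-facing transparency label shown alongside the ad."""
    return f"Ad | Paid by {d.advertiser} | Targeted because: {d.targeting_summary}"
```
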
12. Recommender Systems (Art. 27)
Our educational recommendation engines operate on "Merit & Relevance" logic. Users can review and adjust the main parameters used by our recommender systems in their "Intelligence Profile" settings.
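User-adjustable recommender parameters can be sketched as a settings dictionary with validated overrides. The parameter names below are illustrative assumptions, not Nexly’s actual "Intelligence Profile" schema:

```python
# Assumed default recommender parameters (illustrative names only).
DEFAULT_PARAMS = {
    "relevance_weight": 0.7,   # weight given to topical relevance
    "recency_weight": 0.3,     # weight given to content freshness
    "personalized": True,      # whether recommendations use profile data
}


def adjust_params(current: dict, updates: dict) -> dict:
    """Apply user overrides (Art. 27 parameter control), rejecting
    unknown keys so the settings surface stays well-defined."""
    unknown = set(updates) - set(current)
    if unknown:
        raise ValueError(f"unknown parameters: {sorted(unknown)}")
    return {**current, **updates}
```

Returning a new dictionary rather than mutating the defaults keeps the platform-wide defaults intact while each user carries their own overrides.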
13. Transparency Reporting & Audit Trail
In accordance with Articles 15 and 24, Nexly publishes an annual "DSA Transparency Report." The report includes quantitative data on content removals, moderation staffing, and the accuracy of our automated detection systems.
DSA Compliance Command
Response SLA: 48h Legal Review • Protocol v2.1