- Terms of Service
- Terms of Use for Creators (Appendix 1)
- Terms of Use for Fans (Appendix 2)
- Referral Program Terms (Appendix 3)
- Acceptable Use Policy (Appendix 4)
- Community Guidelines (Appendix 5)
- Report and Complaint Policy (Appendix 6)
- KYC Procedures and Requirements (Appendix 7)
- Relevant Legislations (Appendix 8)
- Anti Human Trafficking Statement (Appendix 9)
Disclaimer
As we offer our Services in different countries, different countries have different laws and regulations governing confirmation of the Age of Majority before accessing our Services, as well as the distribution and dissemination of adult content. It is also our profound concern to ensure compliance with the relevant regulatory frameworks and legislation of the jurisdictions in which we offer our Services. You hereby expressly warrant and undertake that:
Unless otherwise defined, defined terms used herein shall have the same meanings given to them in the Terms of Use for all Users (the “Terms”).
Control of Obscene and Indecent Articles Ordinance (Cap 390 under the laws of Hong Kong)
The Control of Obscene and Indecent Articles Ordinance (Cap 390) (the “COIAO”) is enacted to control articles which consist of or contain material that is obscene or indecent (including material that is violent, depraved or repulsive). Under the COIAO, “obscenity” covers violence, depravity and repulsiveness, and it is an offence to publish an obscene article. An “article” is defined as “anything consisting of or containing material to be read or looked at or both read and looked at, any sound recording, and any film, video-tape, disc, or other record of a picture or pictures”. The penalty for this offence is imprisonment for up to three years and a fine of up to HK$1,000,000.
Prevention of Child Pornography Ordinance (Cap 579 under the laws of Hong Kong)
Under the Prevention of Child Pornography Ordinance (Cap 579) (the “PCPO”):
- “Child” means a person under the age of 16 years;
- “Child Pornography” refers to:
- photograph, film, computer-generated image or other visual depiction that is a pornographic depiction of a person who is or is depicted as being a child, whether it is made or generated by electronic or any other means, whether or not it is a depiction of a real person and whether or not it has been modified; or
- anything that incorporates a photograph, film, image or depiction referred to in paragraph (a) and includes data stored in a form that is capable of conversion into a photograph, film, image or depiction referred to in paragraph (a) and anything containing such data.
It is an offence, contrary to section 3 of the PCPO, for any person to:
- print, make, produce, reproduce, copy, import or export any child pornography. The maximum penalty is a fine of HK$2,000,000 and imprisonment for 8 years (or, on summary conviction, a fine of HK$1,000,000 and imprisonment for 3 years);
- publish any child pornography. The maximum penalty is a fine of HK$2,000,000 and imprisonment for 8 years (or, on summary conviction, a fine of HK$1,000,000 and imprisonment for 3 years);
- possess any child pornography (unless he or she is the only person pornographically depicted in the child pornography). The maximum penalty is a fine of HK$1,000,000 and imprisonment for 5 years (or, on summary conviction, a fine of HK$500,000 and imprisonment for 2 years); or
- publish or cause to be published any advertisement that conveys or is likely to be understood as conveying the message that any person has published, publishes or intends to publish any child pornography. The maximum penalty is a fine of HK$2,000,000 and imprisonment for 8 years (or, on summary conviction, a fine of HK$1,000,000 and imprisonment for 3 years).
Under the PCPO, a person “publishes” child pornography if he or she distributes, circulates, sells, hires, gives or lends the child pornography to another person; or shows the child pornography in any manner whatsoever to another person, including publicly displaying the child pornography in any public street or pier, public garden or any public place to which the public are permitted to have access. Wink has a zero-tolerance policy towards the publication and promotion of child pornography.
Crimes Ordinance (Cap 200 under the laws of Hong Kong)
Under section 138A of the Crimes Ordinance, any person who uses, procures or offers another person who is under the age of 18 for making pornography, or for a live pornographic performance, in which that other person is or is to be pornographically depicted, commits an offence and is liable on conviction on indictment:
- if the offence is committed in relation to a person under the age of 16, to a fine of HK$3,000,000 and to imprisonment for 10 years; or
- if the offence is committed in relation to a person of the age of 16 or above but under 18, to a fine of HK$1,000,000 and to imprisonment for 5 years.
California Age-Appropriate Design Code Act (CAADCA)
California enacted its Age-Appropriate Design Code Act 1 (the “CAADCA”) in 2022, which requires online services operating in California that children are “likely to access” (not just services directed at children) to consider the best interests of children when designing products, services, and features and to prioritize children’s privacy, safety, and well-being over commercial interests. Before making any new product, service, or feature available to the public, these online services must complete a “Data Protection Impact Assessment” to determine whether it could subject children to harmful or potentially harmful content, whether any associated algorithms or targeted advertising systems could cause harm to children, and how the product, service, or feature uses children’s personal information.
Other provisions in the CAADCA require online services that children are likely to access to estimate the age of child users, configure default privacy settings provided to children to a high level of privacy, provide privacy information and terms of service in clear language suited to children, provide an obvious signal to children when a parent or guardian is monitoring their online activity or tracking their location, enforce published terms of service and community standards, and provide tools to help children exercise their privacy rights and report concerns.
Kids Online Safety Act
The Kids Online Safety Act (the “KOSA”) would require online services that are reasonably likely to be used by minors to provide certain parental controls and optional safeguards for minors, including the ability to restrict who can message them and view their profiles, monitor screen time and establish limits, control or opt out of personalized recommendation systems, restrict purchases, manage privacy settings, and report harms that occur. Online services would also have to default minors’ accounts to the strictest privacy and safety settings.
The KOSA also stipulates transparency requirements, including requiring online services to provide information to minors and their parents about safeguards, personalized recommendation systems, and advertising. Online services with at least 10 million monthly active users in the US would also have to publish public reports on the potential risks to minors, as assessed by an independent third-party audit, and on the steps the online service takes to mitigate those risks.
Obscenity Involving Minors
Federal statutes in the US prohibit obscenity involving minors and child pornography. Section 2256 of Title 18, United States Code (“USC”) defines child pornography as “any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age)”. Visual depictions include photographs, videos, digital or computer-generated images indistinguishable from an actual minor, and images created, adapted, or modified that appear to depict an identifiable, actual minor. Undeveloped film, undeveloped videotape, and electronically stored data that can be converted into a visual image of child pornography are also deemed illegal visual depictions under federal law. Federal laws prohibit the production, distribution, reception, and possession of an image of child pornography using or affecting any means or facility of interstate or foreign commerce. Some specific examples include:
Section 1470 of Title 18, USC
Prohibits any individual from knowingly transferring or attempting to transfer obscene matter, using the U.S. mail or any means or facility of interstate or foreign commerce, to a minor under 16 years of age. Convicted offenders face fines and imprisonment for up to 10 years.
Section 1466A of Title 18, USC
It is illegal for any person to knowingly produce, distribute, receive, or possess with intent to transfer or distribute visual representations, such as drawings, cartoons, or paintings, that appear to depict minors engaged in sexually explicit conduct and are deemed obscene. Matter involving minors can be deemed obscene if it (i) depicts an image that is, or appears to be, a minor engaged in graphic bestiality, sadistic or masochistic abuse, or sexual intercourse and (ii) lacks serious literary, artistic, political, or scientific value. A first-time offender convicted under this statute faces fines and at least 5 years to a maximum of 20 years in prison.
Child Sex Trafficking
Child Sex Trafficking is prohibited by 18 U.S.C. § 1591. It is a federal offense to knowingly recruit, entice, harbor, transport, provide, obtain, or maintain a minor (defined as someone under 18 years of age) knowing, or in reckless disregard of, the fact that the victim is a minor and would be caused to engage in a commercial sex act. It is illegal both to offer and to obtain a child, and to cause the child to engage in any kind of sexual activity in exchange for anything of value, whether it be money, goods, personal benefit, in-kind favors, or some other kind of benefit.
1“AB-2273 The California Age-Appropriate Design Code Act,” California Legislative Information, accessed March 21, 2024, https://leginfo.legislature.ca.gov/faces/billCompareClient.xhtml?bill_id=202120220AB2273&showamends=false.
The Online Safety Act 2023
Children’s online access to pornographic content on pornography services and services that host user-generated content, such as social media services, will be regulated by the Office of Communications (“Ofcom”) under the Online Safety Act 2023 (the “OSA 2023”). The OSA 2023 will cover all online services with pornographic content, including commercial pornography sites, social media, and forums, as well as search engines, and strives to ensure that services hosting pornographic content restrict children’s access to it. The OSA 2023 defines a user-to-user service as “an internet service by means of which content that is generated directly on the service by a user of the service or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service”. This is a broad definition that captures all services that allow users to share content (including but not limited to text, icons, videos and pictures); services such as social media services, online forums, and private messaging apps fall within it.
Under the OSA 2023, user-to-user services (such as social media platforms) which are likely to be accessed by children will need to complete a children’s risk assessment to establish the risk of harm presented to children by their service. Wink has measures in place to mitigate the harm to children presented by harmful content, such as the age verification requirements before accessing our Terms and our Platform, and all of our Users will be above the Age of Majority in the jurisdictions in which they reside. Search services are also covered by the OSA 2023. Search services that children are likely to access will be required to conduct a child safety risk assessment and take proportionate steps to minimise the risk of children of all ages encountering pornographic content. In addition, generative artificial intelligence services may also meet the definition of a user-to-user service or search service under the OSA 2023.
Ofcom is also the regulator for video-sharing platforms (“VSPs”) established in the UK, as set out in Part 4B of the Communications Act 2003. VSPs are a type of online video service which allow users to upload and share videos with the public and where the platform does not play an editorial role. Although we are not a UK-based VSP, there is no assurance that our Platform will not fall within the regulatory framework of the Communications Act 2003. Ofcom also prescribes measures and guidance for VSPs to follow, and we will strive to follow such measures and guidance prescribed by Ofcom from time to time.
The Criminal Justice and Immigration Act 2008
The offence at Section 63 of the Criminal Justice and Immigration Act 2008 (the “CJI Act 2008”) criminalises the possession of “extreme” pornographic images and targets pornographic material which it is already an offence to publish or distribute under the Obscene Publications Act 1959 (the “OPA”). For material to be in scope, it must reasonably be assumed to have been produced solely or principally for the purpose of sexual arousal, and be grossly offensive, disgusting or otherwise obscene. The CJI Act 2008 further describes the types of images captured by the offence. These include images explicitly and realistically depicting acts which threaten a person’s life; acts which result, or are likely to result, in serious injury to a person’s anus, breasts, or genitals; necrophilia and bestiality, where a reasonable person looking at the images would think that the person or animal was real; and images of non-consensual penetrative sexual activity. The offence applies equally to material published online, broadcast, or published in physical form, and carries a maximum penalty of two years’ imprisonment. Internet services are exempt from this offence if they are found to host this content but are not aware of it and have not themselves published or distributed it (e.g. promoted or shared it). Wink has a zero-tolerance policy, and our Community Guidelines strictly prohibit the posting and dissemination of content involving serious injury, necrophilia, bestiality and non-consensual penetrative sexual activity.
Criminal Justice and Courts Act 2015
There is a range of offences which target the non-consensual sharing of intimate images. These are not targeted at “pornography” specifically but can be applied where an individual’s intimate image or video is published or shared on a pornography site without their consent, for example, the offence of “disclosing private sexual photographs and films with intent to cause distress” at Section 33 of the Criminal Justice and Courts Act 2015 (the “CJC Act 2015”) (often referred to as the “revenge porn” offence).
Section 33 of the CJC Act 2015 targets image-based sexual abuse. The so-called “revenge pornography” offence targets the sharing of, or threat to share, private sexual photographs and films of someone, either online or offline, without their consent and with intent to cause the person depicted distress. To strengthen image-based sexual abuse offences, the OSA 2023 has created four new offences targeting the sharing of, or threats to share, intimate images, including:
- a ‘base’ offence of sharing an intimate photograph or film without consent or a reasonable belief in consent (the maximum penalty is 6 months’ imprisonment);
- an offence of sharing an intimate photograph or film without consent, with intent to cause alarm, distress or humiliation (the maximum penalty is 2 years’ imprisonment);
- an offence of sharing an intimate image without consent or a reasonable belief in consent, for the purpose of obtaining sexual gratification (the maximum penalty is 2 years’ imprisonment); and
- an offence of threatening to share an intimate image (the maximum penalty is 2 years’ imprisonment).
Offenders convicted of these offences may also be subject to notification requirements, commonly referred to as being on the “sex offender’s register”.
These new sharing offences apply to manufactured or altered images (including Deepfakes) for the first time. They also broaden the interpretation of “intimate” to include images that show or appear to show a person who is nude or partially nude or which depict sexual or toileting behaviour.
The Protection of Children Act 1978
The Protection of Children Act 1978 creates an absolute prohibition on the taking, making, distribution, publication, showing and possession with a view to distribution of any indecent photograph (or pseudo-photograph) of a child under 18. The courts decide in each case whether the material in question is indecent or not. These offences carry a maximum sentence of 10 years’ imprisonment.
Modern Slavery Act 2015
The Act stipulates that a person commits a human trafficking offence if he or she arranges or facilitates the travel of another person with a view to that person being exploited. Exploitation is defined to include sexual exploitation.
The Sexual Offences Act 2003
The Act contains a range of offences to deal with sexual exploitation and sexual abuse in all its forms. These include non-consensual sexual offences, sexual exploitation offences, sexual offences against children, and arranging or facilitating the commission of a child sex offence.
The Digital Services Act
The Digital Services Act (the “DSA”) entered into force in the European Union (the “EU”) on 16 November 2022. The DSA aims to complement the rules of the General Data Protection Regulation (the “GDPR”) to ensure the highest level of data protection, and covers online intermediaries and platforms (for example, online marketplaces, social networks, content-sharing platforms, app stores, and online travel and accommodation platforms) with the aim of setting a new standard for the accountability of online platforms regarding disinformation, illegal content, and other societal risks. The DSA applies to all online intermediaries and platforms in the EU. The DSA also prescribes provisions for “very large online platforms” (“VLOPs”) and “very large online search engines” (“VLOSEs”), and imposes cumulative obligations on intermediary services that fall within the definitions of: (i) hosting services; (ii) online platforms and marketplaces; and (iii) VLOPs and VLOSEs. Large online platforms for adult content, such as Pornhub, XNXX, Stripchat and Xvideos, have been designated 2 as VLOPs. The DSA strives to ensure:
- transparency of content removal decisions and orders;
- publicly available reports on how automated content moderation is used and its error rate;
- harmonisation of responses to illegal content online;
- fewer dark patterns 3 online;
- a ban on targeted advertising using sensitive data or the data of minors; and
- greater transparency for users regarding the flow of their information, such as information on the parameters of recommender systems, and accessible terms and conditions.
The DSA also requires platforms to have easy-to-use flagging mechanisms for illegal content. Wink has a valid Communication Portal and Report and Complaint Policy for Users to flag and report illegal content. Wink will also strive to ensure that reports and complaints we receive through our Communication Portal are handled by qualified staff in a timely, non-discriminatory manner, and we shall ban any advertising materials and posts on the Platform that use protected data such as sexual orientation, ethnicity, or religion, as well as targeted advertising aimed at minors.
Under the DSA, VLOPs and VLOSEs have to perform an assessment of risks, including disinformation, election manipulation, cyber violence against women, and harms to minors online. Although Wink is not a VLOP or VLOSE, we shall regularly conduct assessments of the risks in using our Platform (taking into account the reports and complaints we receive from Users).
2 https://digital-strategy.ec.europa.eu/en/policies/list-designated-vlops-and-vloses
3 “Dark patterns are a way of designing online platforms to trick users into doing things they otherwise would not have considered, often but not always involving money. For example, platforms might trick users into sharing more information than they would otherwise agree to, or they might advertise a cheaper but unavailable product and then direct the user to similar products that cost more. Other examples include tricking users to subscribe to services, hiding or creating misleading buttons, making it difficult to unsubscribe to newsletters and more. The DSA contains an obligation that equates to a ban on using so-called dark patterns on online platforms. Under this obligation, online platforms will have to design their services in a way that does not deceive, manipulate or otherwise materially distort or impair the ability of users to make free and informed decision” https://digital-strategy.ec.europa.eu/en/faqs/digital-services-act-questions-and-answers