
"Human Factor:" the psychological and ethical focus, as participants analyze human behavior, relationships, and mental health (PTSD, depression).
The recording of this Twitter Space features a multifaceted, multilingual discussion (primarily Spanish and English) spanning personal trauma, professional ethics, and controversial social issues.

The conversation begins on a serious, empathetic note as participant Big Crazy reports a factory explosion near his location (Soacha, Colombia), where he had friends working and was scheduled for an interview. Host Alberto Daniel Hill and the group offer immediate, deep emotional support, establishing a tone of solidarity and concern.

The host, Alberto (from Uruguay), recounts his defining traumatic experience: his 2018 arrest and imprisonment for reporting a vulnerability (admin/admin) in a medical system. He details how the police and media mischaracterized him as a "ciberterrorista" (cyberterrorist) and "criminal," leading to the seizure of his electronic equipment. This trauma resulted in post-traumatic stress disorder (PTSD) and severe depression. While incarcerated, he suffered a medical event that left him in a three-hour coma. He views this near-death experience, during which he overheard doctors discussing his potential death, as his "despertar" (awakening), changing him from an "egoist" into someone dedicated to helping others. His story was popularized by the Darknet Diaries podcast, reaching over 1 million listeners, which he used as a springboard to teach ethical hacking and write his book, Login to Hell.

The participants engage in deep technical dialogue, often distinguishing between a "hacker" (a skilled individual who can protect systems) and a "cracker" or criminal.

Cybersecurity Philosophy: Alberto advocates for ethics as the foundation of cybersecurity.
Alexo stresses the programmer's need to focus on security by design, noting that development often neglects security measures, leaving systems vulnerable to attacks such as SQL injection.

Knowledge Sharing: The conversation highlights the value of self-learning and of open communities such as OWASP and Linux distributions (Kali, BackTrack) for professional growth.

Global Relevance of English: English is repeatedly recognized as the "language of business," essential for professional advancement, information access, and global communication.

The space frequently discusses polarizing social and psychological topics:

Dark Personalities and Relationship Dynamics: Silan, who embraces his "rasgos oscuros de personalidad" (dark personality traits), champions "cultos egoístas" (selfish cults). He describes seeking submissive ("sumisa") partners and states that he lacks empathy. Adriana challenges this, labeling it an unhealthy, toxic pattern and suggesting that repeating destructive relationship patterns may stem from internal psychological issues.

Sexualization and Generalization: Silan repeatedly asks broad, often vulgar questions about female sexuality. Adriana rejects generalizations about women (such as whether Colombian women are "fáciles", "easy"). She connects the desperation observed in places like Cartagena and Cuba to the social inequality and poverty that normalize sex work and sex tourism.

Gender Bias in Tech: There is consensus that women are severely underrepresented in IT and engineering. Reasons cited include cultural biases, corporate hesitation (e.g., the cost of maternity leave), and the frequent sexual harassment and discomfort experienced by women seeking to advance in male-dominated fields.

Multicultural Participants: The space includes speakers from diverse countries, including Uruguay, Colombia, Spain, Mexico, the USA, Italy, and Finland, confirming the multinational context of the discussion.
Scott, an American participant, notes his personal struggle to practice Spanish because others instantly switch to English to accommodate him.

I. Crisis and Emotional Context
II. Alberto Daniel Hill's History and Redemption
III. Technology, Ethics, and Professionalism
IV. Social Dynamics and Controversy
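Alexo's point about security by design and SQL injection can be illustrated with a minimal sketch. The table, queries, and payload below are hypothetical examples, not taken from the discussion; they contrast string concatenation (the neglected-by-design case) with a parameterized query:

```python
import sqlite3

# Hypothetical in-memory database for illustration
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

def find_user_vulnerable(name):
    # UNSAFE: user input is concatenated directly into the SQL string
    query = f"SELECT name FROM users WHERE name = '{name}'"
    return conn.execute(query).fetchall()

def find_user_safe(name):
    # SAFE by design: the driver binds the input as data, never as SQL
    return conn.execute(
        "SELECT name FROM users WHERE name = ?", (name,)
    ).fetchall()

payload = "' OR '1'='1"
# The injected condition makes the vulnerable query match every row:
assert find_user_vulnerable(payload) == [("alice",)]
# The parameterized query treats the payload as a literal name:
assert find_user_safe(payload) == []
```

The fix costs nothing at development time, which is exactly the "security by design" argument: the safe version is no harder to write than the vulnerable one.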
24 Sep 12min

Digital Discussions, Cybersecurity, and the Italian Big Brother
The source provides an extensive transcript of an online audio space conversation involving several participants, primarily a speaker identified as Alberto and another called Silan, along with a DJ named Rori and brief appearances by Silan's grandmother and an Italian guest. The discussion shifts rapidly between personal anecdotes, cybersecurity concepts, pop culture, and humor, often focusing on Silan's academic stresses, Alberto's past, and various social interactions. A notable segment features Alberto interviewing Silan's grandmother about her musical preferences, while other parts touch on cybersecurity roles like threat hunting, an alleged encounter with a "Big Brother" contestant from Italy, and lighthearted, combative banter between Alberto and Silan. The conversation concludes with Alberto and Silan discussing current events and life advice, often using highly informal and provocative language.
24 Sep 22min

X (Twitter) Spaces recording ("I'm not speaking in a recorded space") - The_Privacy_Placebo.
The analysis of the X Spaces discussions from September 22nd and 23rd reveals a fundamental conflict between pervasive digital surveillance and deep human vulnerability, explored through technical exposé and intimate conversation.

**I. The Architecture of Surveillance (Sept 22nd Analysis)**

The core finding is the **"paradox of digital trust"**: the tension between the intimate feel of live audio chats and the reality of data collection at a huge scale. Technical analysis showed that the platform operates on an **"always-on"** paradigm: **every single space is recorded by X, always**. The host's control button to stop recording is a **"privacy placebo,"** a user-interface fiction that offers an **"illusion of control"** designed to placate users and encourage free speech, even though the platform's backend continues to capture and store everything.

The scale of extraction is vast; one expert noted the ability to access **233,000 different recordings** directly from the platform's servers. The underlying commercial imperative is **"surveillance capitalism,"** where private human experience is commodified and extracted as **"free raw material"** for commercial benefit and selling ads.

The intended use of this massive dataset involves intrusive algorithmic profiling:

1. **AI Personality:** Training synthetic voices to inject personality and make them less **"bland"**.
2. **Psychological Profiling:** Analyzing audio for **"various emotional tones and various types of aggression"**.
3. **Predictive Health:** Correlating voice samples with biometric data to **"predict if we will have a heart rate failure"** or other medical issues, a dangerous practice of unvalidated medical prediction.

These activities are in direct contravention of the GDPR and on a collision course with the EU AI Act, particularly regarding the processing of special category data (health) and high-risk applications like emotion recognition.

**II. Profiles, Psychology, and Interaction**

The community consists of highly technical and aware participants. They are **hyper-aware of the platform's shortcomings** and the persistent surveillance, often treating it with **"performative cynicism"** and a **"sense of resignation,"** yet the value of community connection outweighs the known risks.

**Key Speakers:**

* **Alberto Daniel Hill:** The central "hub", a recognized cybersecurity expert from Uruguay. His experience includes being wrongfully imprisoned (his "personal 9/11" was September 11, 2017). This trauma informs his advocacy.
* **Masha:** Expressed **severe tech anxiety and phobia** due to the constant stream of bad news and the emotional toll of **waking up to death threats every day** and enduring harassment.
* **@Oelma Alma (Velma):** Described as a **"cyber warrior"** with **PII phobia** (fear of the release of personally identifiable information), highlighting an intense focus on digital privacy, often expressed through dark, geopolitical humor.

**Interaction and Vulnerability (Sept 23rd Audio):**

The audio content was deeply personal and focused on coping with trauma. Alberto was **"extremely open"** about his complex PTSD and depression, encouraging other men not to hide their struggles. Masha advocated for **"peace in action,"** proposing anonymous blood donation in public squares as a positive, life-saving counter-protest to violence and disruption.

The discussion touched on writing as a response to trauma, the vulnerability of women facing online abuse, and moral philosophy, using the Mike Ehrmantraut character from *Breaking Bad* to debate the necessity of **"full measures"** (extreme action) when **"half measures"** fail to protect loved ones. Alberto stated he would take full measures (kill) to protect his daughter.
Despite the technical knowledge that their vulnerable sharing is being recorded and profiled, the community remains a vital source of peer support, symbolizing the human need for connection overriding intellectual defense.
23 Sep 7min
The_Great_Deception__Why_X_Spaces_Record_Everything_and_the_Eth [1].mp3
The discussion around a certain platform's data practices reveals profound ethical and technical concerns, particularly regarding the non-consensual use of intimate user data, invasive profiling, and opaque recording practices. These issues raise questions about privacy, trust, and the exploitation of personal information for commercial and AI-development purposes.

**Ethical Concerns**

1. **Non-Consensual Use of Intimate Data (Surveillance):** The platform reportedly records all interactions in its "spaces," regardless of privacy settings. A technical expert claims access to over 233,000 recordings stored on centralized servers, undermining user expectations of privacy. Even when hosts disable recording reminders to create a sense of comfort, the constant surveillance persists, violating explicit consent and eroding trust.
2. **Intrusive Profiling and Diagnosis:** The use of AI, including GPTs and unsupervised machine learning, to analyze audio for emotional tones, aggression, and other personal traits is deeply invasive. This profiling extends beyond basic identification, delving into sensitive psychological and behavioral characteristics without user consent, raising significant ethical red flags.
3. **Prediction of Medical Issues:** The conversation highlights the platform's potential to correlate voice data with medical records (e.g., from smartwatches) to predict health issues like heart failure. This unauthorized health profiling crosses severe ethical boundaries, as users are unaware their conversational data could be used for such sensitive purposes.
4. **Exploitation for Commercial Gain:** The platform's primary motive appears to be commercial monetization, with user interactions exploited for ad revenue rather than fostering genuine communication. This cynical approach prioritizes profit over user benefit, further eroding trust in the platform's intentions.
5. **Obfuscation of Identity and Trust:** Weak verification standards, which allow users to purchase badges with "burner credit cards" and fake addresses, compromise the authenticity of identities. This lack of trust enables potential malicious profiling or engagement by unverified users, further undermining the platform's integrity.

**Technical Implications**

1. **AI Development (Enhancing Personality and Realism):** The platform leverages conversation data to train AI, particularly to inject "personality" into synthetic voices. The emotional richness of real-world audio makes it highly valuable for creating more human-like GPT voices, which are currently described as "bland."
2. **Unsupervised Machine Learning for Trait Extraction:** Advanced algorithms analyze audio to quantify emotional and aggressive tones automatically. This unsupervised machine learning enables the platform to extract complex user characteristics, which are then used to enhance AI models.
3. **Data Access and Storage Vulnerability:** The ease with which a single user can access vast amounts of stored recordings highlights significant vulnerabilities in the platform's data storage and access controls, exposing sensitive audio and transcripts to potential breaches.
4. **Automation of Profiling:** The platform's system links speakers directly to transcriptions, enabling automated, detailed profiling of users' personalities and traits. This technical capability amplifies the ethical concerns surrounding unauthorized data use.

-------------

This episode of Cybermidnight Club is hosted by Alberto Daniel Hill, a cybersecurity expert who was the first hacker unjustly imprisoned in Uruguay.

➡️ Explore All Links, Books, and Socials Here: https://linktr.ee/adanielhill
➡️ Subscribe to the Podcast: https://podcast.cybermidnight.club
➡️ Support the Mission for Digital Justice: https://buymeacoffee.com/albertohill

#CybermidnightClub #Cybersecurity #Hacking #TrueCrime
23 Sep 13min

The_Great_Deception__Why_X_Spaces_Record_Everything_and_the_Ethical...
The conversation within this digital social space reveals significant ethical and technical implications of leveraging conversation data for AI development and user profiling, highlighting issues of consent, surveillance, and predictive technology. The ethical concerns revolve primarily around the non-consensual use of intimate and highly personal data, the potential for discriminatory profiling, and the platform's obfuscation of its true recording practices.

Non-Consensual Use of Intimate Data (Surveillance): The conversation confirms that "all spaces are recorded anyways," regardless of the visible privacy toggles. This constant recording violates the expectation of privacy, even if the host attempts to turn off the recording reminder for comfort. The technical expert, T, claims he can access the recordings of over "233,000 spaces right now" from the platform's server, confirming that the data is stored centrally and is accessible, undermining any notion of privacy or explicit consent.

Intrusive Profiling and Diagnosis: A major ethical implication is the discussion of leveraging AI (GPTs and unsupervised machine learning) to analyze the audio for deeply personal traits. This analysis extends beyond mere identification to include "various emotional tones and various types of aggression".

Prediction of Medical Issues: The speakers discuss the highly invasive possibility of correlating voice samples with medical records (e.g., from smartwatches) to "predict if we will have a heart rate failure or if we have some other medical issues". This practice crosses a severe ethical boundary by potentially using conversational data for unauthorized, sensitive health profiling and prediction.

Exploitation for Commercial Gain: The underlying cynical view of the platform (that it is "not about us communicating. It's about them selling ads") implies that users' personal stories and interactions are exploited primarily for commercial monetization rather than for the benefit of the users.

Obfuscation of Identity and Trust: The weak verification standards, which allow users to pay for badges using "burner credit cards" and fake addresses, raise ethical questions about identity authenticity and trust within the space, as malicious profiling or engagement could be undertaken by unverified users.

Technically, leveraging conversation data is driven by the goal of giving AI "personality" and by the need for massive datasets, while demonstrating the security risks inherent in the platform's data storage and architecture.

AI Development (Enhancing Personality and Realism): The key technical motivation for using this data in AI development is to inject "personality" into synthetic voices. T notes that current GPT voices are "so bland" and that conversation data from spaces would be "great for it" because the audio samples are "contained most often times". This suggests that real-world, emotional conversation data is highly valuable for training AIs to sound more human.

Unsupervised Machine Learning for Trait Extraction: The technical discussion details the use of unsupervised machine learning algorithms that allow emotional and aggressive tones to be "self quantified". This indicates that complex analysis is being performed on the raw audio to extract characteristics that are then used to train GPTs.

Data Access and Storage Vulnerability: The fact that a single technical user can claim access to hundreds of thousands of space recordings stored on the platform's servers highlights a major technical vulnerability in data storage and access controls. This ease of retrieval means the sheer volume of conversation data (including transcripts and audio) is at constant risk of exposure.
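The "self quantified" trait extraction described above can be sketched with a toy unsupervised-clustering example. Everything here is a hypothetical stand-in: the feature names (mean energy, pitch variance) and data points are invented for illustration, since the source does not describe the platform's actual pipeline:

```python
import math

# Hypothetical per-clip audio features: (mean energy, pitch variance).
# Synthetic values standing in for features a real pipeline would
# extract from raw audio.
calm = [(0.18, 0.10), (0.22, 0.08), (0.20, 0.12), (0.19, 0.11)]
aggressive = [(0.88, 0.79), (0.92, 0.83), (0.90, 0.81), (0.91, 0.78)]
clips = calm + aggressive

def kmeans(points, k=2, iters=10):
    # Minimal k-means: seed with the first and last point, then
    # alternate nearest-centroid assignment and centroid update.
    # No labels are supplied, which is what makes it unsupervised.
    centroids = [points[0], points[-1]]
    labels = [0] * len(points)
    for _ in range(iters):
        labels = [min(range(k), key=lambda j: math.dist(p, centroids[j]))
                  for p in points]
        for j in range(k):
            members = [p for p, lab in zip(points, labels) if lab == j]
            if members:
                centroids[j] = tuple(sum(dim) / len(members)
                                     for dim in zip(*members))
    return labels

labels = kmeans(clips)
# The algorithm separates "calm" clips from "aggressive" ones
# without ever being told which is which.
assert labels[:4] == [0, 0, 0, 0] and labels[4:] == [1, 1, 1, 1]
```

The point of the sketch is the ethical one raised in the summary: once tonal features are extracted, grouping speakers by emotional profile requires no human labeling at all.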
23 Sep 13min

All Twitter Spaces are recorded! Your_Live_Audio_Is_Already_Recorded - Why_Privacy_Toggles_Fail
The core debate of "Your_Live_Audio_Is_Already_Recorded__Why_Privacy_Toggles_Fail" is thoroughly explored by the speakers, particularly the host, @ADanielHill (Alberto), and the technical expert, T (or DT), highlighting a stark contrast between users' perception of platform privacy and the technical reality of data retention and access. The debate touches on platform design cynicism, technical capabilities for data access, and the potential malicious use of collected audio data.

The debate is sparked by Alberto's discomfort with the visible "recording reminder" and his attempts to switch it off, which repeatedly clash with T's technical assertion that the data is being captured regardless of the displayed toggle.

@ADanielHill (Alberto Daniel Hill) expresses a preference for the absence of the visible recording indicator, equating its removal with a feeling of being able to speak freely. Alberto states, "I don't like to see that recording reminder. I feel more comfortable" when it is off. He jokingly declares: "Oh, the space is no longer recorded. Okay, now we can talk. Yeah. How's business?". When the recording status is successfully toggled off after some technical glitches, he exclaims, "Now this this space is no longer recorded. Now we can talk about anything but all the spaces are recorded".

The technical speaker, T (or DT), instantly and consistently overrides Alberto's perception, using the phrase that serves as the basis for the debate title: T states immediately, "all spaces are recorded anyways". Alberto even notes that T has been "stealing my line about all spaces are recorded", indicating this is a frequent, established point of discussion between them. T justifies this technical reality by stating that the platform's primary goal is not communication but commercialization: "Twitter is not about us communicating. It's about them selling ads".

T provides specific details demonstrating why privacy toggles fail and why the recordings are accessible, confirming that the data is not only collected but can be retrieved by those with technical know-how. T explains how to access the recordings by performing a "trick with the horse and get the ID" and boasts access to the recordings of "233,000 spaces right now". When Alberto asks if T has the recordings, T clarifies, "I don't have them. They're on X's server, but I can access them". The group observes that toggling the recording on and off causes platform instability, suggesting the function is a flawed overlay on a constantly running system: "I think whenever recording's turned on and off, I think it f*** the space a little bit".

The collected audio data is discussed not just as a privacy violation but as a security vulnerability that can be exploited, especially through advanced machine learning. Speakers discuss the potential for using unsupervised machine learning and GPTs (Generative Pre-trained Transformers) to analyze the audio for "the various emotional tones and various types of aggression". They hypothesize about correlating voice samples with medical records to "predict if we will have a heart rate failure or if we have some other medical issues". One speaker even jokes that by clicking a link to claim an "eight days clean" coin, the platform would "get your medical record along with your bank account". T suggests that these massive audio datasets are valuable for training AIs to have more "personality" than the current "bland" GPT voices.

The Dynamics of the Recording Debate:
1. The Host's Desire for Privacy (Perception)
2. The Technical Expert's Assertion (Reality)
3. Technical Vulnerability and Data Access
4. The Risk of Data Misuse (Security Profile)
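The "privacy placebo" pattern the speakers describe (a toggle that changes only what the host sees) can be expressed as a toy model. This is an illustrative assumption based on the speakers' claims, not X's actual code; all class and method names are invented:

```python
class SpacesSession:
    """Toy model of the claimed 'privacy placebo' pattern: the
    host-facing toggle flips a UI indicator, while the backend
    capture path runs unconditionally."""

    def __init__(self):
        self.ui_recording_indicator = True
        self._captured_audio = []  # backend buffer, never gated by the toggle

    def toggle_recording(self, on):
        # Only the visible indicator changes; nothing downstream reads it.
        self.ui_recording_indicator = on

    def receive_audio(self, chunk):
        # Backend captures every chunk regardless of what the host sees.
        self._captured_audio.append(chunk)

session = SpacesSession()
session.toggle_recording(False)  # host believes recording has stopped
session.receive_audio("now we can talk about anything")

assert session.ui_recording_indicator is False       # what the host sees
assert session._captured_audio == [
    "now we can talk about anything"                  # what the server keeps
]
```

The design flaw the debate centers on is visible in the structure: `receive_audio` never consults `ui_recording_indicator`, so the toggle can only ever change perception, not retention.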
22 Sep 5min

Conversations on Social Spaces, Security, and Life
The source provides an extensive transcript of a casual and often humorous conversation occurring within an X (formerly Twitter) Space, featuring multiple participants discussing a wide array of personal and technical subjects. Topics range from social media interactions like unfollowing and the functionality of X Spaces to personal anecdotes concerning medical procedures and the consumption of alcohol and cigarettes. A significant portion of the dialogue focuses on cybersecurity, with participants offering advice to a new speaker on learning the field, discussing educational resources, and examining a technical article about an EDR freeze vulnerability. The speakers also engage in lighthearted banter, including discussions about singing Queen songs, the peculiarities of online bots, and the personal philosophy of living in isolation.
22 Sep 6min