Longevity gene therapy and AI – What is on the horizon?

Longevity Gene Therapy

Gene therapy holds promise for extending human lifespan and enhancing healthspan by targeting genes associated with aging processes. Longevity gene therapy, particularly interventions focusing on genes like TERT (telomerase reverse transcriptase), Klotho, and Myostatin, is at the forefront of experimental research. Companies such as Bioviva, Libella, and Minicircle are pioneering these interventions, albeit with varying degrees of transparency and scientific rigor.

TERT, Klotho, and Myostatin in Longevity

  • TERT: The TERT gene encodes an enzyme essential for telomere maintenance, which is linked to cellular aging. Overexpression of TERT in model organisms has lengthened telomeres, potentially delaying aging.
  • Klotho: This gene plays a crucial role in regulating aging and lifespan. Klotho protein has been associated with multiple protective effects against age-related diseases.
  • Myostatin: Known for its role in regulating muscle growth, inhibiting Myostatin can result in increased muscle mass and strength, which could counteract some age-related physical decline.
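
The telomere mechanism behind the TERT bullet above can be sketched as a toy simulation: telomeres shorten with each division, a cell senesces below a critical length, and TERT activity offsets part of the loss. All numbers below are arbitrary illustrations, not biological measurements.

```python
# Toy simulation (illustrative only): telomeres shorten with each cell
# division, and a cell senesces once length drops below a threshold.
# TERT activity is modeled as partial re-elongation per division.

def divisions_until_senescence(telomere_bp, loss_per_division, tert_gain=0):
    """Count divisions before telomere length falls to the senescence cutoff."""
    SENESCENCE_THRESHOLD = 3000  # arbitrary cutoff in base pairs
    divisions = 0
    while telomere_bp > SENESCENCE_THRESHOLD:
        telomere_bp -= loss_per_division
        telomere_bp += tert_gain  # TERT partially offsets the loss
        divisions += 1
    return divisions

baseline = divisions_until_senescence(10000, 100)
with_tert = divisions_until_senescence(10000, 100, tert_gain=50)
print(baseline, with_tert)  # prints 70 140: TERT extends replicative capacity
```

Halving the net loss doubles the replicative capacity in this sketch, which is the qualitative point of TERT overexpression experiments, not a quantitative claim.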

The Experimental Nature of Longevity Gene Therapy

The application of gene therapy for longevity remains largely experimental. Most available data come from preclinical studies, primarily in animal models. Human data are scarce, raising questions about efficacy, safety, and potential long-term effects. The ethical implications of these experimental treatments, especially in the absence of robust data, are significant, touching on issues of access, consent, and potential unforeseen consequences.

Companies Offering Longevity Gene Therapy

  • Bioviva: Notably involved in this field, Bioviva has been vocal about its endeavors in gene therapy for aging. While they have published some data from mouse studies, human data remain limited.
  • Libella and Minicircle: These companies also offer longevity gene therapies but face similar challenges in providing comprehensive human data to back their claims.

Industry Perspective vs. Public Discourse

The discourse around longevity gene therapy is predominantly shaped by those within the industry, such as Liz Parrish of Bioviva and Bryan Johnson. While their insights are valuable, they may also be biased towards promoting their interventions. The lack of widespread discussion on platforms like Reddit and Twitter, especially from independent sources or those outside the industry, points to a need for greater transparency and peer-reviewed research.

Ethical and Regulatory Considerations

The ethical and regulatory landscape for gene therapy is complex, particularly for treatments aimed at non-disease conditions like aging. The experimental status of longevity gene therapies raises significant ethical questions, particularly around informed consent and the potential long-term impacts. Regulatory bodies are tasked with balancing the potential benefits of such innovative treatments against the risks and ethical concerns, requiring a robust framework for clinical trials and approval processes.

Longevity Gene Therapy and AI

Integrating Artificial Intelligence (AI) into longevity gene therapy represents a groundbreaking intersection of biotechnology and computational sciences. AI and machine learning algorithms are increasingly employed to decipher complex biological data, predict the impacts of genetic modifications, and optimize therapy designs. In the context of longevity gene therapy, AI can analyze vast datasets from genomics, proteomics, and metabolomics to identify new therapeutic targets, understand the intricate mechanisms of aging, and predict individual responses to gene therapies. This computational power enables researchers to simulate the effects of gene editing or modulation before actual clinical application, enhancing the precision and safety of therapies. Furthermore, AI-driven platforms facilitate the personalized tailoring of gene therapy interventions, taking into account the unique genetic makeup of each individual, which is crucial for effective and minimally invasive treatment strategies. The synergy between AI and longevity gene therapy accelerates the pace of discovery and development in this field, promising more rapid translation of research findings into clinical applications that could extend human healthspan and lifespan.
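
As a minimal sketch of the kind of predictive modeling described above, the following trains a hand-rolled logistic-regression classifier on synthetic expression levels for two hypothetical genes to predict a binary therapy-response label. The data, genes, and model are invented for illustration; real pipelines use far richer omics features and validated methods.

```python
# Illustrative sketch only: a hand-rolled logistic-regression classifier
# trained on synthetic "expression levels" of two hypothetical genes to
# predict a binary therapy-response label. Nothing here reflects real
# clinical or omics data.
import math
import random

random.seed(0)
# Synthetic cohort: response mostly driven by gene A, weakly by gene B.
cohort = [(random.uniform(0, 1), random.uniform(0, 1)) for _ in range(200)]
labels = [1 if a + 0.3 * b + random.gauss(0, 0.1) > 0.7 else 0
          for a, b in cohort]

w_a, w_b, bias = 0.0, 0.0, 0.0
lr = 0.5
for _ in range(300):  # plain stochastic gradient descent on log-loss
    for (a, b), y in zip(cohort, labels):
        p = 1 / (1 + math.exp(-(w_a * a + w_b * b + bias)))
        err = p - y
        w_a -= lr * err * a
        w_b -= lr * err * b
        bias -= lr * err

def predict(a, b):
    """Predicted probability that a patient with this profile responds."""
    return 1 / (1 + math.exp(-(w_a * a + w_b * b + bias)))

# Higher expression of the driver gene yields a higher predicted response.
print(predict(0.9, 0.9) > predict(0.1, 0.1))
```

The design point is that even this toy model "personalizes" a prediction from an individual profile; the research systems described above do the same with vastly larger feature sets and models.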

Moving Forward

For longevity gene therapy to advance from experimental to accepted medical practice, several key developments are needed:

  • Robust Human Clinical Trials: Rigorous, peer-reviewed clinical trials involving human participants are essential to establish the safety and efficacy of gene therapies for longevity.
  • Transparency and Peer Review: Open sharing of data and peer-reviewed publication of results can help build credibility and foster a more informed public discourse.
  • Ethical and Regulatory Frameworks: Developing clear ethical guidelines and regulatory pathways for these therapies will be crucial in ensuring they are deployed responsibly.

The future of longevity gene therapy is fraught with challenges but also holds immense promise. As the field evolves, a multidisciplinary approach involving scientists, ethicists, regulators, and the public will be crucial in realizing its potential in a responsible and beneficial manner.

Longevity gene therapy and AI: Annex

What are the top 10 most promising potential longevity therapies being researched?

I think the idea of treating aging as a disease that’s treatable and preventable in some ways is a really necessary focus. The OP works with some of the world’s top researchers using HBOT as part of that process to increase oxygen in the blood and open new pathways in the brain to address cognitive decline and increase HealthSpan (vs. just lifespan). Pretty cool stuff!

HBOT in longevity research stands for “hyperbaric oxygen therapy.” It has been the subject of research for its potential effects on healthy aging. Several studies have shown that HBOT can target aging hallmarks, including telomere shortening and senescent cell accumulation, at the cellular level. For example, a prospective trial found that HBOT can significantly modulate the pathophysiology of skin aging in a healthy aging population, indicating effects such as angiogenesis and senescent cell clearance. Additionally, research has demonstrated that HBOT may induce significant senolytic effects, including increasing telomere length and decreasing senescent cell accumulation in aging adults. The potential of HBOT in healthy aging and its implications for longevity are still being explored, and further research is needed to fully understand its effects and potential applications.

2- Are they also looking into HBOT as a treatment for erectile dysfunction?

Definitely! Dr. Shai Efrati has been doing research around that and had a study published in the Journal of Sexual Medicine. Dr. Efrati and his team found that 80% of men “reported improved erections” after HBOT therapy: https://www.nature.com/articles/s41443-018-0023-9

3- I think cellular reprogramming seems to be one of the most promising approaches https://www.lifespan.io/topic/yamanaka-factors/

4- Next-gen senolytics (e.g., Rubedo, Oisin, Deciduous).

Cellular rejuvenation, a.k.a. partial reprogramming (as someone else already said), not just by Yamanaka (OSKM) factors or cocktail variants but also by other novel Yamanaka-factor alternatives.

Stem cell secretions.

Treatments for aging extra-cellular matrix (ECM).

5- Rapamycin is the most promising short term.

I see a lot of people saying reprogramming, and I think the idea is promising, but as someone who worked on reprogramming cells in vitro I can tell you that any proof of concept in vivo in large animal models is far away.

6- Blood-focused therapies (dilution, plasma refactoring, E5, exosomes); perhaps look at Yuvan Research.

7- I think plasmapheresis is a technology most likely to be proven beneficial in the near term and also a technology that can be scaled and offered for reasonable prices.

8- Bioelectricity: if we succeed in interpreting the code of electrical signals by which cells communicate, we can control any tissue growth and development, including organ regeneration.

9- Gene therapy and reprogramming will blow the lid off the maximum lifespan. Turning longevity genes on/expressing proteins that repair cellular damage and reversing epigenetic changes that occur with aging.

10- I don’t think anything currently being researched (that we know of) has the potential to take us to immortality. That’ll likely end up requiring some pretty sophisticated nanotechnology. However, the important part isn’t getting to immortality, but getting to LEV. In that respect, I’d say senolytics and stem cell treatments are both looking pretty promising. (And can likely achieve more in combination than on their own.)

11- Spiroligomers to remove glucosepane from the ECM.

12- Yuvan Research. Look up the recent paper they have with Steve Horvath on porcine plasma fractions.

13- This OP thinks most of the therapies being researched will end up having insignificant effects. The only thing that looks promising to me is new tissue grown from injected stem cells or outright organ replacement. Nothing else will address DNA damage, which results in gene loss, dysregulation of gene expression, and loss of suppression of transposable elements.

14- A couple that haven’t been mentioned:


  • The killer T-cells that target MR-1 and seem to be able to find and kill all common cancer types.
  • Also Maia Biotech’s THIO (“WILT 2.0”).
  • Mitochondria: mitochondrial infusion that lasts, or the allotopic expression of the remaining proteins SENS is working on.

15- Look for the first updates coming from Altos Labs.

Altos Labs is a biotechnology research company focused on unraveling the deep biology of cell rejuvenation to reverse disease and develop life extension therapies that can halt or reverse the human aging process. The company’s goal is to increase the “healthspan” of humans, with longevity extension being an “accidental consequence” of their work. Altos Labs is dedicated to restoring cell health and resilience through cell rejuvenation to reverse disease, injury, and disabilities that can occur throughout life. The company is working on specialized cell therapies based on induced pluripotent stem cells to achieve these objectives. Altos Labs is known for its atypical focus on basic research without immediate prospects of a commercially viable product, and it has attracted significant investment, including a $3 billion funding round in January 2022. The company’s research is based on the fundamental biology of cell rejuvenation, aiming to understand and harness the ability of cells to resist stressors that give rise to disease, particularly in the context of aging.

16- Not so much a “therapy,” but I think research into growing human organs may be very promising long term. Organ transplants grown from your own cells would mean zero rejection issues and no supply limitations. Near term, drugs like rapamycin show good potential for slowing the aging process and are in human trials.

What is biological reprogramming technology?

Biological reprogramming technology involves the process of converting specialized cells into a pluripotent state, which can then be directed to become a different cell type. This technology has significant implications for regenerative medicine, disease modeling, and drug discovery. It is based on the concept that a cell’s identity is defined by the gene regulatory networks that are active in the cell, and these networks can be controlled by transcription factors. Reprogramming can be achieved through various methods, including the introduction of exogenous factors such as transcription factors. The process of reprogramming involves the erasure and remodeling of epigenetic marks, such as DNA methylation, to reset the cell’s epigenetic memory, allowing it to be directed to different cell fates. This technology has the potential to create new cells for regenerative medicine and to provide insights into the fundamental basis of cell identity and disease.
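
The idea above, that cell identity is an active gene regulatory network which exogenous transcription factors can reset and then redirect, can be caricatured in a few lines. The factor sets and cell states below are toy stand-ins (the somatic and lineage factors are hypothetical choices), not a biological model.

```python
# Highly simplified illustration (not a biological model): cell identity
# as a set of active transcription factors. "Reprogramming" introduces the
# Yamanaka factors (OSKM), resetting the state toward pluripotency, after
# which lineage-specific factors can direct a new fate.

YAMANAKA_FACTORS = {"OCT4", "SOX2", "KLF4", "MYC"}

def reprogram(active_factors):
    """Erase the somatic program and activate the pluripotency network."""
    # The previous identity is discarded: a toy stand-in for epigenetic reset.
    return set(YAMANAKA_FACTORS)

def differentiate(pluripotent_state, lineage_factors):
    """Direct a pluripotent cell toward a new fate via lineage factors."""
    return (pluripotent_state - YAMANAKA_FACTORS) | set(lineage_factors)

fibroblast = {"SNAI2", "TWIST1"}  # hypothetical somatic state
ipsc = reprogram(fibroblast)
neuron_like = differentiate(ipsc, {"NEUROD1", "ASCL1"})
print(sorted(neuron_like))  # → ['ASCL1', 'NEUROD1']
```

The real process is of course stochastic and epigenetic rather than a clean set operation, but the sketch captures the erase-then-redirect logic the paragraph describes.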

See also

Links to external Longevity-related sites



Longevity Wiki

Outline of Life Extension on Wikipedia

Index of life extension related Wikipedia articles

Accelerate cure for Alzheimers
Aging in Motion
Aging Matters
Aging Portfolio
Alliance for Aging Research
Alliance for Regenerative Medicine
American Academy of Anti-Aging Medicine
American Aging Association
American Federation for Aging Research
American Society on Aging
Blue Zones – /r/BlueZones
Brain Preservation Foundation
British Society for Research on Aging
Calico Labs
Caloric Restriction Society
Church of Perpetual Life
Coalition for Radical Life Extension
Dog Aging Project
ELPI Foundation for Indefinite Lifespan
Fight Aging! Blog
Found My Fitness
Friends of NIA
Gerontology Wiki
Global Healthspan Policy Institute
Health Extension
Healthspan Campaign
Humanity+ magazine
Humanity+ wiki
International Cell Senescence Association
International Longevity Alliance
International Longevity Centre Global Alliance
International Society on Aging and Disease
Juvena Therapeutics
Leucadia Therapeutics
Life Extension Advocacy Foundation
Life Extension Foundation
Lifeboat Foundation
Longevity History
Longevity Vision Fund
Loyal for Dogs
Lysoclear
MDI Biological Laboratory
Methuselah Foundation
New Organ Alliance
Oisin Biotechnologies
Organ Preservation Alliance
Palo Alto Longevity Prize
Rejuvenaction Blog
Rubedo Life Sciences
Stealth BioTherapeutics
The War On Aging
Unity Biotechnologies
Water Bear Lair

Good Informational Sites:

Programmed Aging Info
Senescence Info
Experimental Gerontology Journal
Mechanisms of Ageing and Development Journal

Schools and Academic Institutions:

Where to do a PhD on aging – a list of labs

Alabama Research Institute on Aging
UT Barshop Institute
Biogerontology Research Foundation
Buck Institute
Columbia Aging Center
Gerontology Research Group
Huffington Center on Aging
Institute for Aging Research – Harvard
Iowa State University Gerontology
Josh Mitteldorf
Longevity Consortium
Max Planck Institute for Biology of Aging – Germany
MIT Agelab
National Institute on Aging
Paul F. Glenn Center for Aging Research – University of Michigan
PennState Center for Healthy Aging
Princeton Longevity Center
Regenerative Sciences Institute
Kogod Center on Aging – Mayo clinic
Salk Institute
Stanford Center on Longevity
Stanford Brunet Lab
Supercentenarian Research Foundation
Texas A&M Center for translational research on aging
Gerontological Society of America
Tufts Human Nutrition and Aging Research
UAMS Donald Reynolds Center on Aging
UCLA Longevity Center
UCSF Memory and Aging Center
UIC Center for research on health and aging
University of Iowa Center on Aging
University of Maryland Center for research on aging
University of Washington Biology of Aging
USC School of Gerontology
Wake Forest Institute of Regenerative Medicine
Yale Center for Research on Aging
