
Friday, July 2, 2021

Scale-up Suite 2 released! Plus news on Dynochem Biologics

It's been even busier than usual at Scale-up Systems recently and here's a catch-up in case you haven't seen it elsewhere yet.

We're delighted to have released Scale-up Suite 2 today (big thanks to Steve Hearn and all of our Development team!) and look forward to customers rolling this out over the next little while.  There are a host of enhancements as covered in our previous post and some new product options, notably RunScript Automation:

  • This exposes calls to the Dynochem and Reaction Lab APIs so that customers in our Digitalization Pilot Program can apply our scripts to create autonomous workflows and build their own scripts to implement high-throughput modeling alongside experimentation. Check out a preview video here [5 minutes].
Dynochem 6 has been released as a component of Scale-up Suite 2.

We've also separated the Dynochem model library for biologics into its own dedicated Resources site and created a new product option around this functionality:

  • It's called Dynochem Biologics and is focused on the development and manufacturing of large molecule therapeutics using cell culture (USP), downstream purification (DSP) and fill/finish operations.  
  • We've built in all of the usual supports you expect from us, including self-paced training and KB articles.  
  • Dynochem Biologics already works with Scale-up Suite 1 and we'll be moving all biologics users across to this platform at the end of July.

Actions for you:

  • Visit our Installer page to upgrade to Scale-up Suite 2 or have your IT colleagues do so if that's necessary in your organization.  Remember, our cloud licensing means you don't need any on-premise license server and Suite 2 works immediately with the cloud licenses (and content) you already use.
  • Contact your Scale-up Systems representative to find out more about our Digitalization Pilot Program or to explore Dynochem Biologics.

Saturday, April 24, 2021

Get ready for Dynochem 6 and Scale-up Suite 2: Modeling for Everyone

Last week, Peter Clark gave a preview of new features coming with Scale-up Suite 2.  If you missed the event live, as usual you can catch the recording on the resources site here.

Peter showed there is something for everyone in the new release.  Whatever modality of drug/active ingredient you develop or make, whether a small or large molecule or somewhere in between, whether made with cell culture or synthetic organic chemistry, your teams and your enterprise can obtain value daily from our tools.  That takes us several steps closer to our vision: to positively impact the development of every potential medicine.

Scale-up Suite 2 includes:

  • Powerful equipment calculators for scale-up, scale-down and tech transfer, leveraging our industry standard vessel database format
  • Rigorous material properties calculators for pure components, mixtures and your proprietary molecules
  • Empirical / machine learning tools to build and use regression models from your data with just a few clicks, including support for DRSM
  • Mechanistic modeling of any unit operation, in user-friendly authoring and model development environments
  • Hybrid modeling, combining the best of both worlds
  • Interactive data visualization, including parallel coordinates and animated contour plots for multidimensional datasets
  • New features to make modeling faster, more productive and more enjoyable, incorporating ideas suggested by customers and from our own team
  • New capabilities for autonomous creation of models, parameter fitting and process optimization 'headless' on the fly, as well as incorporation of real-time data and access from any device.

We believe that:
  • Interdisciplinary collaboration accelerates process development and innovation
  • Models facilitate collaboration and knowledge exchange
  • Interactive, real-time simulations save days and weeks of speculation
  • Models are documents with a lifecycle extending from discovery to patient
  • Model authoring tools must be convenient and easy to use
  • Teams need models that are easily shared
  • Enterprises need tools that embed a modeling culture and support wide participation.
In other words, modeling should be an option for Everyone.  To make that a reality for you, we support our software tools with:

  • an Online Library, containing hundreds of templates, documentation and self-paced training
  • Free one-hour online training events monthly
  • Half-day and full-day options for face-to-face training, available globally
  • A free certification program to formally recognize your progress and skills
  • Outstanding user support from PhD-qualified experts with experience supporting hundreds of projects like yours
  • A thriving user community, with round tables and regular customer presentations sharing knowledge and best practices.

We're celebrating 21 years serving the industry this year, supporting more than 20,000 user projects annually, for more than 100 customers all over the world, including 15 of the top 15 pharma companies.

If you're an industry, academic or regulatory practitioner, we invite you to join our user community and start to reap the benefits for your projects.

Wednesday, January 27, 2021

Dynochem biologics model library released

Many thanks to customers who engaged with Scale-up Systems as we "built, broke and bettered" our biologics model library in the run-up to release late last year.

More than one hundred biopharmaceutical companies in the Scale-up Suite global user community can now access the tools for immediate use here (https://dcresources.scale-up.com/?q=bio).  An overview of the biologics library is available here.

We expect each tool to grow and be refined by the repeated use that is typical of customer activity and we look forward to supporting more users in taking up the tools in their daily work.

Much like the small molecule opportunity, mechanistic modeling has great potential to accelerate the development of large molecules by shortening development time, making best use of experiments and anticipating manufacturing challenges.  Ours is the first comprehensive, fit-for-purpose mechanistic model library to be built and released in this space, another first of which we are very proud.

Using the Dynochem biologics library delivers daily benefits in development and scale-up while creating digital twins to support your digitalization strategy.

Training opportunities using the new tools will be available at regular intervals this year.  Let us know if you'd like a dedicated session for your company or site.

Feel free to share this post with anyone you think may benefit.

Tuesday, December 1, 2020

Digital Tech Transfer using the Dynochem Vessel Database

The pharma industry practice of 'process fit', which allows the manufacture of most products by re-using existing physical assets, raises the perennial question of whether a given process running well at Lab A or Site B can also be run well at Site C.  Anyone who cooks or bakes even occasionally in their own kitchen will know that equipment dimensions and operating conditions affect product quality (and cycle time) and the same is true at manufacturing scale.

This problem used to be handled with a 'boots on the ground' approach, where extensive air travel and time on site allowed detailed oversight, some costly experimentation and tweaks locally before manufacturing.  With a large portion of manufacturing now contracted out to CDMOs, tech transfer remains challenging unless you have the right tools.

Working with over 100 companies engaged in the development and manufacture of pharmaceuticals, we get an up-close look at the issues, challenges and opportunities around tech transfer.  Probably the single biggest factor that makes our tools indispensable to accelerate this work is the Dynochem Vessel Database.

Users like to achieve 'equivalence' between equipment performance at the transferring and receiving sites.  Equivalence may sound simple, but the different scaling laws that apply to mixing, heat transfer, solids suspension and mass transfer make this complex; and that's before even considering meso-mixing and micromixing.  Apparently inconsequential differences that are easy to miss, such as materials of construction, heat transfer fluids, impeller types, sizes and positions and even feed locations, can have a large impact on performance at the receiving site.

The likelihood of Right First Time tech transfer increases dramatically with a sufficiently detailed Vessel Database that accurately stores the configuration of site equipment.  Link that with the recipe of the target process, our equipment calculators and peer-reviewed physical properties from our Materials System and you can perform Digital Tech Transfer quickly and accurately without leaving your desk.

If you haven't already created the Vessel Database for your site or wider organization, you can start here from our template.  It's an ideal project for a young engineer and once done correctly, saves time for everyone on the team.

Selection of 'impeller' types in the Dynochem Vessel Database; users may also add custom impellers and internals

Friday, September 25, 2020

Inspired by the industry response

Many of our posts on the blog this year have been about the pandemic, predicting its course and interpreting reported data for cases and deaths.  

We have seen that population-level Dynochem models have been sufficiently accurate to describe the data for each country and to quantify the potential future impact of the outbreak, as well as the effectiveness of non-pharmaceutical measures such as lockdowns and the wearing of masks.

Our models for the outbreak will remain available to the user community on our COVID site.  We do not plan to further develop or update the models for the foreseeable future.  

New content on the blog will return to our core focus, positively impacting the development of medicines by our customers, the global pharmaceutical industry.  

We are proud to serve the pharmaceutical industry, supporting daily core business activities at more than 100 organizations that develop or make medicines. The industry response to COVID-19 has been inspiring and no less than we expected, having worked with some of these companies for two decades.

All of our normal activities including software development, user support and training have continued in a fully operational state and we have seen increased activity from customers both using and learning the tools. Of course we are delivering all events on-line for now. Public training events are half price during the outbreak and we have been offering training licenses to customers delivering their own internal curriculum.

We have ramped up our own support for unit operations likely to be involved in manufacturing vaccines and treatments, including bioreactors and lyophilization.

We would be delighted to hear from members of the community anytime if you have ideas or suggestions as to how we could do more, by email to support@scale-up.com.

Reaction Lab in action
Our Reaction Lab software helps chemists develop kinetic models, maximize yield and minimize impurities.

Monday, February 3, 2020

2019 round-up and looking forward to 2020

To ardent watchers of this blog - you know who you are - apologies for the pause in postings.  A lot has been happening since August 2019 as we follow our mission to accelerate process development and positively impact the development of every potential medicine.

We'll be posting more regularly in 2020, with lots of news and new capabilities in the pipeline.  In the meantime, here's a catch-up on some items from the latter part of 2019 and a picture that summarizes a few of them:

A few Scale-up Systems and Scale-up Suite highlights from the end of 2019; more details below.
  • We presented our sponsored AIChE award for Outstanding Contribution to QbD for Drug Substance to the 2019 winner, Zoltan Nagy of Purdue University, at the Annual Meeting in Orlando [pic - top right]
  • We nominated John Peterson of GSK for the corresponding Pfizer-sponsored Drug Product award, for his excellent work on statistics of design space, and he won.  It was great to catch up with John at the awards session [pic - top left]
  • Andrew Bird presented statistically rigorous calculations of design space for three common unit operations, and ways to dramatically accelerate the calculations [pic - top centre]; watch for the details in a 2020 webinar
  • A growing band of Dynochem and Reaction Lab users are keeping warm this winter in our 'beanie' hats, helping the environment with our keep-cups and looking forward to summer in our polo shirts [pic - bottom centre]
  • And we've updated our certification programs to a fully automated on-line system with randomized questions.  Visit the Resources site and search for 'certified' to find out more and take the test.

Sunday, August 18, 2019

Part 6 of 6: Join us on the journey to 'Industry 4.0'

If you've been following this series on Industry 4.0, you'll know by now that multiple technologies, maturing now or in the near future, are creating significant opportunities for acceleration and transformation in chemical development and manufacturing.

If you already work with our company, you'll have seen the model libraries, software tools, training materials and user support that make everyday applications a reality.  If you haven't seen them yet, most likely your company is one of the over 100 with access - contact us if you cannot easily locate your internal champion.

Over 100 companies engaged in the development and commercialization of new medicines rely on our tools at over 400 sites worldwide
We're always listening to feedback; even since this series started, we have delivered updates in Scale-up Suite to extend support for automation applications, with parallel processing for more tasks and new function calls to return 'data cubes' as arrays in memory from large simulation sets.
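To make the 'data cube' idea concrete, here is a minimal numpy sketch of holding a large simulation set in memory as a single indexed array.  The axes, names and dummy response below are ours for illustration only; in a real workflow the array contents would come from the automation calls rather than a formula.

    import numpy as np

    # Hypothetical 3-D "data cube": axis 0 = temperature, axis 1 = feed time,
    # axis 2 = simulation time point.  Values are dummies standing in for
    # simulated trajectories returned by the automation calls.
    temperatures = np.linspace(20, 60, 5)      # degC
    feed_times = np.linspace(1, 5, 4)          # h
    time_points = np.linspace(0, 10, 101)      # h

    cube = np.zeros((temperatures.size, feed_times.size, time_points.size))
    for i, T in enumerate(temperatures):
        for j, tf in enumerate(feed_times):
            cube[i, j, :] = (1 - np.exp(-time_points / tf)) * (T / 60.0)

    # Slicing answers questions without re-running any simulations,
    # e.g. the end-of-batch response across all conditions:
    final_response = cube[:, :, -1]
    print(final_response.shape)                # (5, 4)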

We've received excellent feedback from our user base in the fast-growing CMO space that leveraging our vessel database format for digital tech transfer is helping to reduce costs and failures, increase speed to make room for more customer projects, reduce deviations and modernize the approach to onboarding new projects.

We hope you've enjoyed this series of six postings.  If you want to hear more, join us for our webinar on this topic in early September:

[Webinar] Opportunities to accelerate Chemical Development as part of "Industry 4.0"

Thursday, July 11, 2019

Part 5 of 6: Opportunities to accelerate projects

You may already know that the most commonly used noun in the English language is "time".  In today's world, many of us feel almost permanently under time pressure and we talk about not having enough time for all kinds of things we'd like to do.  Not having time takes on a whole new meaning for patients with life-changing medical conditions, reminding us in chemical development and scale-up that opportunities to accelerate our work and the commercialization of new medicines should be seized with both hands.

Achieving acceleration using modeling (e.g. Dynochem or Reaction Lab) is already well covered by extensive case studies from customers in Dynochem Resources.  Acceleration using automation of modeling, and connection of modeling to other workflows, is the subject of this post.  In our core software development team, we have thought a lot about these future applications and taken steps to support their realization, providing a platform and the 'hooks' needed to link with other technologies.

A basic building block is the ability to automatically generate and run a large number of virtual experiments.  We use parallel processing to execute the simulations, as illustrated in the short animation below; a minimal scripted sketch follows the animation.  The automation calls are exposed and may be 'scripted' and run by other programs (e.g. Python) as part of an integrated workflow.
Chemists and engineers can leverage automated generation and execution of a large set of virtual experiments with parallel processing and collation of results in convenient Excel tables and contour plots.
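By way of illustration, here is a minimal Python sketch of that pattern: a grid of virtual experiments generated programmatically and executed in parallel.  The run_virtual_experiment function and its dummy yield response are stand-ins of ours; in a real workflow it would call into the exposed automation interface instead.

    import itertools
    from concurrent.futures import ProcessPoolExecutor

    def run_virtual_experiment(conditions):
        """Stand-in for a call into the simulation engine."""
        temperature, feed_time = conditions
        # dummy response in place of a real simulated yield
        yield_ = 0.9 - 0.001 * (temperature - 50) ** 2 - 0.02 * abs(feed_time - 3)
        return {"T": temperature, "feed_time": feed_time, "yield": yield_}

    if __name__ == "__main__":
        grid = list(itertools.product([30, 40, 50, 60, 70],    # degC
                                      [1, 2, 3, 4, 5]))        # h
        # run the full grid of virtual experiments in parallel
        with ProcessPoolExecutor() as pool:
            results = list(pool.map(run_virtual_experiment, grid))
        print(max(results, key=lambda r: r["yield"]))

Collating such results into Excel tables or contour plots is then a routine scripting step.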
Tasks involved in model building may also be scripted/automated in Dynochem 5 and Reaction Lab.  For example, area percent data may be entered in a model, a set of kinetic parameters fitted and many simulations carried out, all without human intervention.  Doing this requires some scripting/code at several stages in the workflow.  Cloud computing resources (Azure or AWS) may be used for execution, leveraging our cloud licensing.

For example, the animation below shows scripted fitting of three UA (heat transfer characterization) values to three solvent tests using Dynochem 5.  Fitting the parameters needed for each of three liquid levels in a reactor takes only a short time.  (The 'fit' button is just for demo purposes; normally the fit would be started from another scripted workflow process.)  A toy version of such a fit is sketched after the caption below.
Scripted parameter fitting is possible using new function calls built into Dynochem 5 and Reaction Lab; this example illustrates automated heat transfer characterization (UA) and the techniques are equally applicable to e.g. chemical kinetics.
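For readers curious about what a scripted UA fit involves, here is a toy Python version under simplifying assumptions: a lumped energy balance m*Cp*dT/dt = UA*(Tj - T) with constant jacket temperature, fitted to one synthetic cooling curve with scipy.  All numbers are made up, and the real workflow would use the function calls built into Dynochem 5 rather than scipy.

    import numpy as np
    from scipy.optimize import least_squares

    m_cp = 5000.0           # J/K, assumed contents heat capacity (mass x Cp)
    T_j, T_0 = 20.0, 60.0   # degC, jacket and initial batch temperatures

    def batch_temperature(t, ua):
        # analytic solution of m*Cp*dT/dt = UA*(Tj - T) for constant Tj
        return T_j + (T_0 - T_j) * np.exp(-ua * t / m_cp)

    # synthetic "measured" cooling curve standing in for a solvent test
    rng = np.random.default_rng(0)
    t_data = np.linspace(0, 1800, 30)      # s
    T_data = batch_temperature(t_data, 12.0) + rng.normal(0, 0.2, t_data.size)

    # least-squares fit of UA to the measured data
    fit = least_squares(lambda p: batch_temperature(t_data, p[0]) - T_data, x0=[5.0])
    print(f"fitted UA = {fit.x[0]:.2f} W/K")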
Additional opportunities exist in leveraging information from electronic lab notebooks (ELN) to create models for users that are already populated with features such as chemical structures and experimental data.  In a move beyond existing relatively crude self-optimizing reactor algorithms, customers are interested in closing the loop between modeling and experimentation, using model outputs to set up and execute the next experiment(s) in a fully automated loop.

Contact our support team if you'd like to discuss any of these applications further for use inside your organization.

Friday, June 7, 2019

Part 4 of 6: Where will the models come from?

If mechanistic modeling is to become a focal point in the project lifecycle, you have to address the question of where the models will come from.  In this context, by 'model' we mean i) the set of equations to be solved, ii) in executable form, with iii) initial values, iv) fitted parameter values where needed and v) experimental data to assess model accuracy.

Q: Who can create these models and when does it make sense for them to do so?
A: For tangible benefits, the creators and users should be the same practitioners / project teams that own and run the development projects, not specialists in an ivory tower who focus only on modeling.  Model development should occur before and during experimentation.  Modeling should not be a 'post-processing' activity that occurs too late to add value or when the time window for data collection has passed.

In Dynochem 5 and Reaction Lab, we have streamlined steps i) to v) so that this vision is achievable.  We include further notes on the individual steps below.

Steps i) to v) can be accomplished in a snap for chemical reactions using Reaction Lab.  The resulting model can be leveraged over and over during the project lifecycle.
Item (i) may be clear and simple for certain common unit operations like heating/cooling and perhaps filtration; for many other operations, identifying which equations to solve may be iterative and challenging.  For models of fairly low complexity, like distillation, the equation set may be obvious but unwieldy to write down for multi-component systems including the energy balance.  For models of chemical reactions, the set of elementary reactions will not become clear until the full cycle i)-v) has been repeated more than once by knowledgeable process chemists.

Unlike some other tools, we do not force users to populate 'matrices' just to define reactions and reaction orders (!)

Item ii) is an obstacle for practitioners who only have access to spreadsheets or specialized computing/coding environments.  These force the user to develop or select a specific solution method and run the risk of significant numerical integration inaccuracies.  Even then, simulations will lack interactivity and parameter estimation will require scripting or complex code.  Some 'high-end' engineering software tools present similar challenges, lacking comprehensive model libraries and forcing users to write custom models, delve into solution algorithms and confront challenges such as 'convergence' that feel highly tangential to project goals.

Item iii) should be easy for practitioners and in practice it can be so, if the software supports flexible units conversion (in and out of SI units) and contains supporting tools to provide initial estimates of physical properties and equipment characteristics.

Item iv) requires the model to be run many times and compared with experimental results.  Specialized algorithms are needed to minimize the gap between model predictions and experimental data.  When multiple parameters must be fitted to multiple responses in multiple experiments, this gets close to impossible in a spreadsheet model and general-purpose mathematical software environments.

Item v) is mainly the province of the experimenter and once each experiment has been completed, requires an easy mechanism for aggregating the data, with flexible units handling (including HPLC Area, Area%) being a major help.

And so to answer the question in the title of this post: You guessed it!  We expect the majority of chemical reaction and unit operation models in Pharma to continue to be developed using our tools in preference to home-made or overly complex environments.  As the volume of modeling activity grows with Industry 4.0 and related developments, we already see this trend becoming more pronounced, with many practitioners needing to use the same model over a project lifecycle, requiring speed and ease of use as well as accuracy and rigour.

Friday, May 10, 2019

Post 3 of 6: Central role of mechanistic modeling in Chemical Development

Chemical Development is a complex and challenging undertaking, involving a large effort from multi-disciplinary teams, sometimes battling Mother Nature, with compressed timelines and limited material for experimentation.  There is a broad spectrum of approaches to this challenge, including new lab instruments, use of robotics and automation, outsourcing certain types of development or operations and use of statistical and mechanistic modeling.  Companies also experiment to find the best organization structure for this function and frequently separate departments specialize in Analytical (Chemistry) Development, Chemical (Process) Development, Technology Transfer and preparation of Regulatory filings.  Collaboration among these groups helps achieve development goals.

Figure 1 (click to enlarge): A simplified representation of chemical development today, including the scale and locus of statistical and mechanistic modeling
Figure 1 is a much simplified graphical representation of the activities involved.  There is a large reliance on experiments.  Groups involved in process definition and optimization are currently the main users of both statistical and mechanistic modeling.  Technology transfer increasingly involves working with external partners remotely.  Data search and gather, including data integrity reviews and preparation of regulatory filings, are mostly manual processes.  The disparate nature of activities and the needs for specialization make them somewhat siloed, with risks of duplication and dilution of effort.  For example, an experimental program may be repeated if the first program missed some key information; or repeated by a CRO to answer new questions that have arisen; or repeated by a CMO in order to accomplish successful tech transfer.  None of these data may be harnessed effectively and shared to answer future questions.

Leading companies are changing their approach to chemical development and bringing mechanistic process modeling on stream earlier and more centrally than before.  The idea is not new but advances in a range of technologies (see earlier posts) and the momentum of 'Industry 4.0' are helping to fuel the transformation.  At a task level, using a model to design the right experiments reduces overall effort.  At a project level, the model provides a place to capture the knowledge and reuse it in future.  At an organization level, modeling provides a structured, reusable and digital approach to information sharing and retrieval.  For example, questions can be answered in real time, without experimentation, interactively when they arise, even live in a meeting or webcon, sparing delays, speculation and doubts, allowing faster progress.

Figure 2 (click to enlarge): Future shape of chemical development activities, with mechanistic process models as the focal point for information capture and reuse.
The pieces in Figure 1 are rearranged in a natural way in Figure 2 as a cycle that captures and makes the most of information generated during each chemical development activity, including modeling.  Additional items have been added to reflect technologies that are relatively new to Pharma, including continuous manufacturing and feedback process control; opportunities to apply either or both of these in chemical development or full scale manufacturing can be evaluated using a mechanistic process model.  Therefore the mechanistic model takes up a central position and is the focal point in the new chemical development process.

It will take some time before Figure 2 reaches its full potential.  The throughput of models in chemical development organizations is already increasing as model building tools become easier to use and more prevalent.  We're delighted to be able to lead the way with Scale-up Suite.

Figure 2 also includes some great opportunities to automate workflows.  We'll discuss some of these in a later post.  

Wednesday, May 1, 2019

Post 2 of 6: A brief history

The Wall Street Journal ran an article in September 2003, entitled "New Prescription For Drug Makers: Update the Plants", comparing and contrasting pharma manufacturing techniques with other industries.  The subtitle ran, perhaps unfairly, "After Years of Neglect, Industry Focuses On Manufacturing; FDA Acts as a Catalyst".

Our DynoChem software entered the industry a few years prior, the prototype having been developed as a dynamic simulator within Zeneca, so that users could "create a dynamic model without having to write differential equations".  We first proved that the software could be used to solve process development and manufacturing problems (e.g. with hydrogenations, exothermic additions), then rewrote the source code and began to add features that made modeling by non-specialists an everyday reality.

There have been many pharma industry leaders who have recognized the potential for modeling to help modernize development and manufacturing.  One example is Dr Paul McKenzie and his leadership team at Bristol-Myers Squibb (BMS) at the time, who cited the Wall Street Journal piece in an invited AIChEJ Perspectives article and also in presentations like this one at the Council for Chemical Research (CCR) in December 2005 - you can get the full slide deck here.

Cover slide from presentation by Paul McKenzie of BMS at CCR Workshop on Process Analytical Technology (PAT), December 13, 2005, Rockville, MD
Today, while the landscape for data storage, sharing and visualization has moved ahead significantly, with the emergence of ELN, cloud and mobile, the chemical and engineering fundamentals of defining and executing a good manufacturing process remain the same:

Some capabilities required to develop robust and scalable processes, from the 2005 CCR presentation
Our Scale-up Suite extends these capabilities to more than 100 pharma development and manufacturing organizations worldwide, including 15 of the top 15 pharmaceutical companies.  This broad and growing base of users, armed with clean and modern user interfaces, calculation power and speed in Reaction Lab and Dynochem 5, provides a firm foundation for the next wave of industry transformation.

We're always delighted to hear what users think.  Here are some recent quotes you may not have seen yet:

  • "If you can book a flight on-line, you can use Dynochem utilities" [we like this especially because we hear that using some other tools is like learning to fly a plane]
  • "Our chemists are thoroughly enjoying the capabilities of Reaction Lab software and are quite thrilled with the tool".

In the next post, we will look at the increasingly central role of mechanistic modeling in process development.

Monday, April 29, 2019

Post 1 of 6: Exciting times in Chemical Development

It's an exciting time to be part of the Pharma industry's chemical development ecosystem, with new opportunities being created and adopted to accelerate development of new medicines.  This is the first in a short series of posts that will focus on the role of predictive, mechanistic modeling in the industry's transformation.

The much-talked-about 'Industry 4.0' phenomenon has led to the creation of awkward terms such as 'digitalization', and one positive consequence of the hype is that it has somewhat aligned the goals of senior managers, systems integrators, consulting companies and industry vendors.  We especially liked the review by Deloitte that uses the term 'exponential technologies' to group many of the developments that underpin current transformation opportunities:

Snapshot of exponential technologies covered in the Deloitte study, Exponential Technologies in Manufacturing
We'll be highlighting the role of digital design, simulation & integration, technologies that our customers have practiced on a growing scale for nearly twenty years.  We expect the rate of growth to increase quite sharply as new developments, like Reaction Lab, make adoption easier and simulation is integrated with other developing technologies.

If the above whets your appetite, watch this space for the next piece in this series.

As always, customers can contact our support team to discuss immediate applications.

Friday, February 22, 2019

Dynochem 5 released as part of the new Scale-up Suite

In our 14 February webinar, Scale-up Systems was delighted to announce the release of Dynochem 5 as part of the new Scale-up Suite, which also includes Reaction Lab and Numero Chem.
Scale-up Suite includes Dynochem and new products Reaction Lab and Numero Chem
Reaction Lab: Kinetics meets ELN

This is the culmination of great work by our software development team, inspired by customer feedback and led by Dr Steve Hearn.

High-level details about the components in Scale-up Suite can be found at the new-look scale-up.com website.  Members of scale-up.com can get more detail and access to the tools via the Dynochem Resources and Reaction Lab Resources websites.

We've started a program of weekly 30-minute webinars to talk through the new features and hope that customers and prospective customers can make some of those live (or watch the recordings) over the next month or two.

Your Dynochem 4 content will work in Dynochem 5 and you should plan to upgrade as soon as practicable.  Expect a host of improvements in speed, ease of use and accuracy, the latter especially for material properties.

Use the links at the side of this blog to explore more.  As always, we'd love to hear your feedback to support@scale-up.com.

Thursday, July 19, 2018

Great set of guest webinars so far this year, more to come, including 'Bourne' sequel; enjoy on your phone

We hope you've been enjoying our free to attend guest webinar program this year as much as we have.

To date in 2018, Syngenta, Johnson Matthey, Nalas, Amgen and Teva have covered topics from one end of a manufacturing stage to the other, addressing synthesis, experimental design, process safety, crystallization and drying.

Who needs Netflix and HBO?  You can enjoy last week's Guest Webinar by Tom Corrie, Syngenta: “Accelerating Active Ingredient Development with Early Stage DynoChem Simulations", and all other webinars in our Guest series, on your smartphone / mobile device, any time of the day or night. [Screenshot from iPhone 8 shown here]

A reminder that you can use your phone both to attend live (Adobe Connect app) and to enjoy recordings (MP4 format, see iPhone screenshot above).  In line with the spirit of GDPR regulations, the identities of our attendees are now anonymized in recordings.

We're impressed by the innovative ways in which users apply our tools and also their openness in discussing process development challenges they face and the solutions they have found.  And there's more to come this year, with Sarafinas Process & Mixing Consulting on use of the legendary 'Bourne Reactions', UCD on continuous crystallization and AstraZeneca on centrifugation, all in the schedule.

Thanks to Steve Cropper and Peter Clark of our team for continuing to line up a great annual program.  2019 is already looking good, with Flow Chemistry and Drying webinars planned.

Thursday, March 29, 2018

BioPharma Europe Initiative: giving Pharma manufacturing a distinctive voice in Brussels

Through our involvement with the pre-competitive collaboration centre SSPC, we have attended a number of events organized by BioPharma Europe, a growing SSPC-led initiative that is raising awareness in the European Parliament and Commission of the unique role, position and needs of the European Pharma industry and seeking policy initiatives that support a strong future for Pharmaceutical Manufacturing in Europe.

Source: EFPIA
After a good start, this group is now seeking to build support from a wider network of European pharma industry stakeholders in the next phase of discussion with Europe's research and regulatory policymakers, such that future policy decisions support this strategically important industry in the globally competitive landscape.

At Scale-up Systems, we are proud of our excellent relationships with pharma companies, CMOs and CROs and are delighted to bring BioPharma Europe to the attention of our customers.  Organizations wishing to find out more about BioPharma Europe should contact Aisling Arthur at SSPC.

Tuesday, February 27, 2018

A PSD trend that is not widely reported - thanks, Orel

While supporting customers who apply DynoChem for crystallization modeling, we have seen several cases where some of the familiar quantiles of the PSD (D10, D50, D90) reduce with time during at least the initial part of the crystallization process.

On reflection, one should not be that surprised: these are statistics rather than the sizes of any individual particles.  In fact, all particles may be getting larger while the weighting of the PSD shifts towards smaller sizes (where particles are more numerous, even without nucleation); in certain cases, this causes D90, D50 and maybe even D10 to reduce during growth.

Last week we had an excellent Guest Webinar from Orel Mizrahi of Teva and Ariel University, who characterized a system with this behaviour, with modeling work summarised in the screenshot below.

D10, D50 and D90 trends in a seeded cooling crystallization: measured data (symbols) and model predictions (curves).
There was a good discussion of these results during Orel's webinar and we decided to make a short animation of a similar system using results from the DynoChem Crystallization Toolbox to help illustrate the effect.
Cumulative PSD from the DynoChem Crystallization toolbox, showing the evolution of PSD shape during growth from a wide seed PSD.  The movement of quantiles D10, D50 and D90 is shown in the lines dropped to the size axis of the curve.
In this illustration, the reduction in D50 can be seen briefly and the reduction in D90 continues through most of the process.  From the changing shape of the curve, with most of the movement on the left-hand side, it is clear that most of the mass is deposited on the (much more numerous) smaller particles.

We see this trend even in growth-dominated systems, when the seed PSD is wide.
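The effect is easy to reproduce numerically.  The numpy sketch below is our own illustration (not the Crystallization Toolbox): every particle in a wide lognormal seed PSD grows by the same 50 micron, yet the volume-weighted D10, D50 and D90 all decrease, because the smaller, far more numerous particles gain proportionally much more volume from the same size increment.

    import numpy as np

    # Wide lognormal number PSD (median 10 micron, GSD = 3); sizes in micron
    L = np.linspace(0.5, 20000.0, 200001)
    sigma, median = np.log(3.0), 10.0
    number_density = np.exp(-(np.log(L / median))**2 / (2 * sigma**2)) / L

    def volume_quantiles(sizes, n_density):
        w = n_density * sizes**3               # volume (mass) weighting
        cdf = np.cumsum(w) / np.sum(w)
        return [float(np.interp(q, cdf, sizes)) for q in (0.1, 0.5, 0.9)]

    # size-independent growth: every particle gets 50 micron larger
    for label, sizes in (("before", L), ("after ", L + 50.0)):
        d10, d50, d90 = volume_quantiles(sizes, number_density)
        print(f"{label} growth: D10={d10:.0f}  D50={d50:.0f}  D90={d90:.0f} micron")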

Wednesday, January 24, 2018

Run typical crystallization experimental design in silico using DynoChem

Faced with challenging timelines for crystallization process development, practitioners typically find themselves running a DOE (statistical design of experiments) and measuring end-point results to see what factors most affect the outcome (often PSD, D10, D50, D90, span).  Thermodynamic, scale-independent effects (like solubility) may be muddled with scale-dependent kinetic effects (like seed temperature and cooling rate or time) in these studies, making results harder to generalize and scale.

First-principles models of crystallization may never be quantitatively perfect - the phenomena are complex and measurement data are limited - but even a semi-quantitative first-principles kinetic model can inform and guide experimentation in a way that DOE or trial-and-error experimentation cannot, leading to a reduction in overall effort and a gain in process understanding, as long as the model is easy to build.

Scale-up predictions for crystallization are often based on maintaining similar agitation, and power per unit mass (or volume) is a typical check, even if the geometry at scale is very different from the lab; a sketch of that calculation follows below.  A first-principles approach considers additional factors, such as whether the solids are fully suspended or over-agitated, how well the heat transfer surface can remove heat and the mixing time associated with the incoming antisolvent feed.
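Here is a short Python sketch of that textbook check, using P = Np * rho * N**3 * D**5 for the turbulent regime; the power number, geometries and speeds are made-up values, and this single criterion is exactly what the first-principles approach goes beyond.

    # equal power per unit volume across scales; illustrative values only
    Np = 0.75       # power number, assumed constant (turbulent, baffled)
    rho = 1000.0    # kg/m3

    def power_per_volume(N, D, V):
        """Specific power (W/m3) for speed N (1/s), impeller diameter D (m)
        and liquid volume V (m3), using P = Np * rho * N**3 * D**5."""
        return Np * rho * N**3 * D**5 / V

    lab = power_per_volume(400 / 60, 0.05, 0.001)   # 1 L vessel, 5 cm impeller
    # plant vessel: 4 m3, 1 m impeller; solve for the speed giving equal P/V
    N_plant = (lab * 4.0 / (Np * rho * 1.0**5)) ** (1 / 3)
    print(f"lab P/V = {lab:.0f} W/m3 -> plant speed = {N_plant * 60:.0f} rpm")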

The DynoChem crystallization library and the associated online training exercises and utilities show how to integrate all of these factors, by designing focused experiments and making quick calculations to obtain thermodynamic, kinetic and vessel performance data separately, before integrating these to both optimize and scale process performance.

Users can easily perform an automated in-silico version of the typical lab DOE in minutes, with 'virtual experiments' reflecting performance of the scaled-up process.  Even if the results are not fully quantitative, users learn about the sensitivities and robustness of their process as well as its scale-dependence.  This heightened awareness alone may be sufficient to resolve problems that arise later in development and scale-up in a calm and rational manner.  Some sample results of a virtual DOE are given below by way of example, followed by a minimal sketch of the design setup.

Heat-map of in-silico DOE at plant scale agitation conditions, showing the effects of four typical factors on D50
The largest D50 is obtained in this case with the highest seeding temperature, lowest seed loading and longest addition (phase 1) time.  Cooling time (phase 2) has a weak effect over the range considered.
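As a minimal sketch of the design setup for such a virtual DOE, the full-factorial grid can be enumerated in a few lines of Python; the factor names mirror the example above, the levels are hypothetical, and each row would be handed to the crystallization model for execution rather than printed.

    import itertools

    # hypothetical 3-level factors for a seeded antisolvent/cooling process
    factors = {
        "seed_temp_C":    [40, 50, 60],
        "seed_loading_%": [1, 2, 5],
        "addition_h":     [1, 3, 6],
        "cooling_h":      [2, 4, 8],
    }

    # full-factorial design: each row is one virtual experiment
    design = list(itertools.product(*factors.values()))
    print(f"{len(design)} virtual experiments")    # 81 runs, no lab time
    for run in design[:3]:
        print(dict(zip(factors, run)))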
Click here to learn how to apply these tools.

Thursday, December 21, 2017

Congratulations to Dr Jake Albrecht of BMS: Winner of AIChE QbD for Drug Substance Award, 2017

At AIChE Annual Meetings, Monday night is Awards night for the Pharma community, represented by PD2M.  This year in Minneapolis the award for Excellence in QbD for Drug Substance process development and scale-up went to Dr Jake Albrecht of Bristol-Myers Squibb.  Congratulations, Jake!


Winners are chosen by a blinded judging panel selected by the Awards Chair, currently Bob Yule of GSK.  Award criteria are:
  • Requires contributions to the state of the art in the public domain (e.g. presentations, articles, publications, best practices)
  • Winner may be in Industry, Academia, Regulatory or other relevant working environment
  • Winner may be from any nation, working at any location
  • There are no age or experience limits
  • Preference is given to work that features chemical engineering
Jake was nominated by colleagues for:
  • his innovative application of modeling methodologies and statistics to enable quality by design process development
  • including one of the most downloaded papers in Computers and Chemical Engineering (2012-2013), “Estimating reaction model parameter uncertainty with Markov Chain Monte Carlo”
  • his leadership and exemplary efforts to promote increasing adoption of modeling and statistical approaches by scientists within BMS and without
  • his leadership in AIChE/PD2M through presentations, chairing meeting sessions, leading annual meeting programming and serving on the PD2M Steering Team
Scale-up Systems was delighted to be involved at the AIChE Annual Meeting this year through our continued sponsorship of this prize.  Some photos and video from the night made it onto our Facebook page and more should appear soon on the PD2M website.

Jake is also a DynoChem power user and delivered a guest webinar in 2013 on connecting DynoChem to other programs, such as MATLAB.

Wednesday, November 29, 2017

November 2017 DynoChem Crystallization Toolbox Upgrade

We're delighted that the number of DynoChem users getting value from our crystallization tools continues to grow strongly and we're grateful for the feedback and feature requests they provide to help us improve the tools.

New features released this November include:
  • One-click conversion of kinetic model into predictor of the shape of the PSD
  • High-resolution tracking of the distribution shape, to minimize error*
  • Extended reporting and plotting of PSD shape.

Sometimes practitioners who are unaware of crystallization fundamentals crystallize too fast, with little attention to the rate of desupersaturation.  For such a rushed process, even when seeded (2%), the operating lines might look like the picture on the left below (Figure 1).  A more experienced practitioner might operate the crystallization as shown on the right (Figure 3):

The particles produced by these alternatives differ greatly in size.  The rushed crystallization leads to a multimodal distribution (red in Figure 2) with low average size, due to seeded growth and separate nucleation events during both antisolvent addition and natural cooling.  These crystals will be difficult to filter and forward-process.

More gradual addition, with attention to crystallization kinetics and both the addition and cooling rates, leads to larger crystals (blue in Figure 2) and a tighter distribution that can be further enhanced by optimizing seed loading, seeding temperature and the operating profiles.

From November 2017, these types of scenarios can be set up, illustrated and reported in minutes using the DynoChem Crystallization Toolbox.


* We have implemented high-resolution finite volume discretisation of the CSD, using the Koren flux limiter.
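For the curious, the Koren limiter itself is compact.  Below is a minimal Python sketch of the published formula together with one limited upwind finite-volume step for size-independent growth; it is our illustration of the technique, not the Toolbox source, and it assumes a uniform size grid and a CFL number below one.

    import numpy as np

    def koren(r):
        # Koren flux limiter: phi(r) = max(0, min(2r, (1 + 2r)/3, 2))
        return np.maximum(0.0,
                          np.minimum(np.minimum(2.0 * r, (1.0 + 2.0 * r) / 3.0), 2.0))

    def growth_step(n, G, dt, dL):
        """One explicit step of dn/dt + G*dn/dL = 0 (G > 0), i.e. growth of
        the CSD at constant rate G, using Koren-limited upwind face values."""
        p = np.pad(n, 2, mode="edge")     # two ghost cells per side
        d = np.diff(p)                    # cell-to-cell differences
        eps = 1e-30                       # guard against 0/0 in smooth regions
        r = d[:-1] / np.where(np.abs(d[1:]) < eps, eps, d[1:])
        face = (p[1:-1] + 0.5 * koren(r) * d[1:])[: n.size + 1]
        return n - (G * dt / dL) * (face[1:] - face[:-1])

    # e.g. advect a narrow seed peak up the size axis at CFL = 0.5
    n = np.exp(-((np.arange(200) - 30.0) / 5.0) ** 2)
    for _ in range(100):
        n = growth_step(n, G=1.0, dt=0.5, dL=1.0)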

Wednesday, October 25, 2017

Simulating PFRs for flow chemistry under transient upset conditions

Readers of this blog will be aware of our RTD utility that helps characterize continuous manufacturing (CM) equipment trains and also simulate the impact of process disturbances, in the absence of chemical reactions.  Pharma CM processes typically have several layers of controls to help ensure that off-spec material is diverted when necessary and as far as possible that disturbances are minimized and detected early. 

For regulatory filings or other purposes, from time to time it may be necessary to simulate transient/ upset conditions in chemically reacting systems (e.g. making drug substance intermediates or final API) to understand the additional chemical effects and to define boundaries for acceptable levels of input variation.  We have been exploring such cases and the most effective way to model them in DynoChem.  Some interesting DC Simulator plots are shown below to illustrate when and for how long such upsets might affect the exit CQA (blue) and impurity level (green) from an example PFR (average residence time 30 minutes) with a ‘typical’ side-reaction. 

Simulation of plug flow reactor with significant and frequent fluctuations in four input variables.  These unusually large variations, if left unchecked, would lead in this example to a breach of the CQA limit (high impurity) twice during a 3-hour operating period.

Simulation of plug flow reactor with a feed pump failure at 90 minutes, lasting for 30 minutes.  In addition to reducing product output, depending on which feed pump fails, this may lead to a temporary increase in impurity level until the feed is restored.
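For intuition about such transients, a classic stand-in for the RTD of a PFR is a series of ideal tanks.  The Python sketch below (our simplification, not the DynoChem model) propagates a 30-minute inlet upset, without reaction, through a train with a 30-minute mean residence time, showing how the disturbance arrives delayed and broadened at the exit.

    import numpy as np

    n_tanks, tau = 20, 30.0        # 20 CSTRs approximating the PFR; tau in min
    dt, t_end = 0.01, 180.0        # explicit Euler step and horizon, min
    c = np.zeros(n_tanks)          # concentrations, baseline = 0

    times = np.arange(0.0, t_end, dt)
    exit_conc = np.empty(times.size)
    for k, t in enumerate(times):
        c_in = 1.0 if 60.0 <= t < 90.0 else 0.0   # 30 min inlet upset at t = 60
        upstream = np.concatenate(([c_in], c[:-1]))
        c += dt * (upstream - c) * n_tanks / tau  # per-tank residence time tau/N
        exit_conc[k] = c[-1]

    peak = exit_conc.argmax()
    print(f"peak exit disturbance {exit_conc[peak]:.2f} at t = {times[peak]:.0f} min")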
