
    Maintaining a Validated State - PV, PM and Statistics Associated with Current Regulation


    Website: https://compliance2go.com/product/?topic=maintaining-a-validated-state-pv-pm-and-statistics-associat

    Category: Biotechnology, Medical Device, Pharmaceutical

    Deadline: July 16, 2015 | Date: July 16, 2015

    Venue/Country: Online, U.S.A

    Updated: 2015-07-06 18:19:52 (GMT+9)

    Call For Papers - CFP

    DESCRIPTION

    Validation is a continuous-improvement journey; it is not a voyage of discovery or the successful completion of three conformance lots. Hence the means by which we identify the elements that need to be “validated” often miss a few steps in operational, cycle-development and performance qualification that, after we move into normal operation, may require re-assessment per the current PV guideline (evaluating the performance of the process identifies problems and determines whether action must be taken to correct and re-validate, and to anticipate and prevent problems so that the process remains in control). This includes an on-going review of the originally defined Critical Steps, Product Test Data, Change Control, and OOS, OOL and OOT incidents, all of which must be included in the Annual Product Review to demonstrate an acceptable validated state of operation and process control.

    Critical Steps, Parameters that Cause Variability, Critical Parameters, Target and Range, Product Test Data, Process Monitoring and Control, Change Control, Critical and Support Utilities, Product Processes, Computer Systems, Controlled Equipment and Facilities, Process and Laboratory Equipment, QC Laboratory Data, Reportable Values, OOS, OOL and OOT Investigations, Control Charts, Method Validation/Revalidation, Revalidation Process/Product Assessment, Data Review, Trending and Analysis, Complaints and Adverse Events, Deviation Investigations/Product Impact, Assessments, Annual Product Reviews.

    Why should you attend:

    Process knowledge and understanding is the basis for establishing an approach to process control for each unit operation, specific to its equipment variables, in order to generate the overall process control needed for validation. The current majority of serious warning letters and consent decrees have been issued against companies with years of experience but an overall miscomprehension of validation expectations. Is a process with a significant number of OOS batches still a validated process? A system, or systems, for detecting unplanned departures from the process as initially designed or currently operated is essential to accomplishing this objective.

    Educating the Experienced

    How do we teach old dogs new tricks?

    How do we teach new dogs new tricks?

    We have to know what the objective is, know what tools we have in our tool-box, know how to use them, and have the confidence to apply them.

    How to get more bang for your buck (validation is a cost-savings program, but executives view validation as a serious expense they would like to avoid)

    Randomized Block Statistical Model for Effective Validation

    Grouping to make the units in a block as uniform as possible, so that observed differences will be largely due to treatment, makes root cause analysis simple and accurate.

    Measure uniformity within a loaded sterilizer chamber and map the chamber at the same time: in two runs, a randomized block experimental design can tell as much about the performance of the sterilizer as a dozen or so conventional experimental temperature distribution and heat penetration runs.

    Uses randomization, an expectation of uniformity, analysis of variance, and an equally likely chance for each location in the chamber to have a lethality value independent of all other locations.
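    As a hedged illustration of this design (the F0 lethality readings below are invented, not data from the webinar), the randomized block sums of squares can be computed in a few lines of Python, treating chamber locations as treatments and the two mapping runs as blocks:

```python
# A minimal sketch of a randomized block ANOVA for a sterilizer mapping
# study. Hypothetical F0 values: rows = chamber locations (treatments),
# columns = the two mapping runs (blocks).
from statistics import mean

data = [
    [12.1, 12.3],   # location A
    [11.8, 12.0],   # location B
    [12.4, 12.6],   # location C
    [11.9, 12.0],   # location D
]

t = len(data)          # number of locations (treatments)
b = len(data[0])       # number of runs (blocks)
grand = mean(v for row in data for v in row)

# Randomized block decomposition of the total sum of squares
ss_total = sum((v - grand) ** 2 for row in data for v in row)
ss_treat = b * sum((mean(row) - grand) ** 2 for row in data)
ss_block = t * sum((mean(col) - grand) ** 2 for col in zip(*data))
ss_error = ss_total - ss_treat - ss_block

# F statistic for the location effect
ms_treat = ss_treat / (t - 1)
ms_error = ss_error / ((t - 1) * (b - 1))
f_location = ms_treat / ms_error
print(f"F(location) = {f_location:.1f}")
```

    A large F for the location effect indicates a real cold or hot spot in the chamber rather than run-to-run noise, which is exactly what two well-designed mapping runs are meant to reveal.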

    Additional Description of the topic:

    Statistical Applications - use of a statistician, or a person with adequate statistical training (not just 6 Sigma), to develop the experimental design and the statistical methods/models needed to measure and evaluate process capability, uniformity and on-going stability.

    Generate a Sense of Control - Process design and development, when assessed by the statistician, should anticipate significant sources of variability and establish statistically significant means of detection, control and mitigation, as well as an initial means of defining the alert limits and action limits, which will most likely change as the process moves into routine operation and begins to provide SQC and 6 Sigma opportunities. Too often, as cited in a variety of 483s, 6 Sigma was used to establish limits during development, which is a significant statistical error.
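    As a minimal SQC sketch of this distinction (the assay results below are invented for illustration), alert and action limits can be derived from routine-production variability rather than from development-stage assumptions:

```python
# Hedged sketch: alert (mean +/- 2s) and action (mean +/- 3s) limits
# computed from routine batch results, the SQC practice described above.
from statistics import mean, stdev

# Hypothetical assay results (% label claim) from routine batches
results = [99.8, 100.1, 99.6, 100.3, 99.9, 100.0, 99.7, 100.2, 99.9, 100.1]

m = mean(results)
s = stdev(results)            # sample standard deviation

alert_low, alert_high = m - 2 * s, m + 2 * s
action_low, action_high = m - 3 * s, m + 3 * s

# Flag any result outside the action limits (a candidate OOL/OOT signal)
flagged = [r for r in results if not action_low <= r <= action_high]
print(f"mean={m:.2f}  alert=({alert_low:.2f}, {alert_high:.2f})  "
      f"action=({action_low:.2f}, {action_high:.2f})  flagged={flagged}")
```

    Because the limits come from observed routine variability, they tighten or widen as the process itself does, which is why limits fixed during development so often mislead.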

    Create Consistency

    Enhance Process Understanding

    Meet Industry Regulations

    Areas Covered in the Session:

    Continue monitoring and/or sampling at the levels established during the development and qualification stages until sufficient data are available to generate statistically significant variability estimates. Once the variability is known, sampling and/or monitoring should be adjusted to a statistically significant level. Variation is to be used to detect the potential for defect complaints, OOS (including OOT and OOL) results, deviation reports, process yield variations, BPR deficiencies, incoming raw material variances, adverse events and many other issues, all of which may be found to enhance a validated (cost-effective with minimal patient risk) means of operation. Hence change control becomes a critical component, using SSR (sound scientific rationale) to manage an on-going validated state.
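    A hedged sketch of the sampling adjustment described above (the standard deviation and margin are invented): once qualification data yield a variability estimate s, the monitoring sample size can be set from the desired confidence-interval half-width using the standard formula n = (z*s/E)^2:

```python
# Minimal sketch: sample size needed to estimate a process mean within a
# chosen margin of error E at ~95% confidence (z = 1.96), n = (z*s/E)^2.
import math

def sample_size(s: float, margin: float, z: float = 1.96) -> int:
    """Samples per monitoring interval so the 95% CI half-width ~= margin."""
    return math.ceil((z * s / margin) ** 2)

# Hypothetical: qualification runs gave s = 0.5 units; we want the mean
# known to within +/- 0.2 units at each monitoring interval.
n = sample_size(s=0.5, margin=0.2)
print(n)
```

    The point of the formula is the trade-off the paragraph describes: a well-characterized, low-variability process justifies less sampling, while a noisy one demands more.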

    1. Appropriate Application of Simple Statistical Tools and the Scientific Method - facts, theories, proposals, functional requirements (FRS), acceptance criteria, and the formation of a hypothesis that is logical, with sound scientific rationale; being scientifically based makes it defendable.

    2. Objective Evaluation - To most scientists, statistics is logic or common sense with a strong admixture of arithmetic procedures. The logic supplies the method by which data are to be collected and determines how extensive they are to be. Assumed awareness, as opposed to critical thinking, creates significant issues during the evaluation phase of any process.

    3. The arithmetic, together with certain numeric tables, yields the material on which to base the inference and measure the level of uncertainty associated with the process variables. The arithmetic is often routine, requiring no special mathematical training for the user; however, the choice of the statistical model requires significant comprehension of the process. One must select the appropriate statistical model, then make sure the sampling fits the model, so that the arithmetic can be done and the ANOVA results interpreted to demonstrate a maintained validated state.

    4. Process observations and data retrieval are the raw materials with which quality and statistical workers deal, which means the results need to be in the form of numbers (objective evidence) resulting from a measurement, not subjective results from check boxes. Tracking pass/fail results is only an “appetizer” to the next phase of assessment of the validated process, including equipment and operator performance. Numbers are what we use to characterize the situation and determine whether the data represent inherent variability or a genuine variation.

    5. Examples of Objective Data: yield, pressure, pH, flow rate, time and elapsed time, amounts, number of defects, length, duration, and many, many more depending on the validated equipment and associated process.

    6. Our Responsibility: We are to manage the collection of data based on the statistical model, and the presentation, characterization and summarization of the data, including elements such as the results from regression analysis, z-scores, randomized blocks and a multitude of other means of assessing the consistency and identified variables of the validated process.

    7. ANOVA - Analysis of Variance

    Not used to test hypotheses about variances; used to test hypotheses about means (bar x and double bar x) - for example, the lifetimes of two different types of light bulbs, or the effectiveness of two different toothpastes. We calculate the sums of squares and variances to measure variation between and within groups. Why? Because we are trying to test a hypothesis about the equality of the data's averages, or means. Sum of squares - the sum of the squared deviations (an individual value minus the average, squared). Null Hypothesis - we test to demonstrate that the difference between the means is null (not 0) or insignificant.

    Linear Regression and Correlation - used to predict unknown values within a bracket (the minimum and maximum operating ranges); outside the bracket lies the future (be careful). Line of Best Fit (Regression Line) - the slope and the Y intercept define the regression line; the idea behind finding a regression line (line of best fit) is based on the assumption that the data are scattered randomly about a single straight line. Correlation Coefficient - 1 is perfect, either positive or negative; 0 indicates the data have no correlation. Correlation is the term used to indicate how close to the single line the data fall.
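    Both tools can be sketched in pure Python (the bulb lifetimes and x/y pairs below are invented for illustration, not the presenter's data): a one-way ANOVA F statistic for comparing group means, and a least-squares line of best fit with its correlation coefficient r.

```python
# Illustrative sketch of one-way ANOVA and least-squares regression.
from statistics import mean

# --- One-way ANOVA: lifetimes (hours) of two hypothetical bulb types ---
groups = [[1020, 980, 1010, 990], [1100, 1080, 1120, 1100]]
grand = mean(v for g in groups for v in g)
n_total = sum(len(g) for g in groups)
k = len(groups)

ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
ss_within = sum((v - mean(g)) ** 2 for g in groups for v in g)
f_stat = (ss_between / (k - 1)) / (ss_within / (n_total - k))

# --- Least-squares line of best fit and correlation coefficient ---
def fit_line(xs, ys):
    mx, my = mean(xs), mean(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    slope = sxy / sxx
    intercept = my - slope * mx
    r = sxy / (sxx * syy) ** 0.5
    return slope, intercept, r

xs = [1.0, 2.0, 3.0, 4.0, 5.0]      # hypothetical set-points
ys = [2.1, 3.9, 6.2, 7.8, 10.1]     # hypothetical responses
slope, intercept, r = fit_line(xs, ys)
print(f"F={f_stat:.1f}  slope={slope:.2f}  intercept={intercept:.2f}  r={r:.3f}")
```

    A large F rejects the null hypothesis that the group means are equal, and an r close to +1 or -1 indicates the data fall close to the single regression line; predictions should stay inside the bracket of observed x values.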

    8. Experimental Design - By now we should understand the need to know how we want to analyze the data before we begin to collect individual observations.

    Who will benefit: (Titles)

    Statisticians, Quality, Engineering and Documentation Personnel within the Pharmaceutical, Medical Device and Solid Dosage Companies

    Webinar Includes:

    Q&A session with the expert to ask your questions

    PDF print only copy of PowerPoint slides

    90 Minutes Live Presentation

    Certificate of Attendance

