Software Engineering: Collected Exercises (《软件工程》习题汇锦)

I. Multiple-Choice Questions
Note: for each question, only one of the four listed options meets the requirement; write its letter in the table below. No credit is given for wrong, multiple, or missing answers.

1. ( ) If a system is being developed where the customers are not sure of what they want, the requirements are often poorly defined. Which of the following would be an appropriate process model for this type of development?
(A) prototyping  (B) waterfall  (C) V-model  (D) spiral
2. ( ) The project team developing a new system is experienced in the domain. Although the new project is fairly large, it is not expected to vary much from applications that have been developed by this team in the past. Which process model would be appropriate for this type of development?
(A) prototyping  (B) waterfall  (C) V-model  (D) spiral
3. ( ) Which of the items listed below is not one of the software engineering layers?
(A) Process  (B) Manufacturing  (C) Methods  (D) Tools
4. ( ) Which of these are the 5 generic software engineering framework activities?
(A) communication, planning, modeling, construction, deployment
(B) communication, risk management, measurement, production, reviewing
(C) analysis, designing, programming, debugging, maintenance
(D) analysis, planning, designing, programming, testing
5. ( ) The incremental model of software development is
(A) A reasonable approach when requirements are well defined.
(B) A good approach when a working core product is required quickly.
USP30 <1208> Sterility Testing: Validation of Isolator Systems (excerpted translation)

USP30-NF25 <1208> STERILITY TESTING — VALIDATION OF ISOLATOR SYSTEMS (excerpts of the portions applicable to fully automated sterility testing systems)

Performance Qualification (PQ)
The PQ phase verifies that the system is functioning in compliance with its operator requirement specifications. At the completion of the PQ phase, the efficacy of the decontamination cycle and, if appropriate, the adequacy of decontaminating chemical venting are verified. All PQ data are adequately summarized, reviewed, and archived.

Cleaning Verification
In general, cleaning is not critical for sterility testing applications. However, residual products are a concern in multiproduct testing, particularly for aggressive antimicrobial agents, because these materials could interfere with the ability of subsequent tests to detect low levels of contamination in the product.
ADAS ECU Design and Development Process

1. Before starting the design, the functional requirements of the ECU need to be determined.
2. Then the system architecture design is carried out to determine the various functional modules of the ECU and the relationships between them.
3. Next is software design: based on the system architecture, the software implementation plan for each functional module is determined.
4. The hardware design phase covers the design of the ECU's hardware circuits and interfaces.
5. After the design phase is completed, the ECU is coded and comprehensively tested.
6. During development, software and hardware debugging and integration testing are necessary.
7. After development is complete, the ECU undergoes verification testing to confirm that it meets the functional requirements (see the sketch after this list).
GMP Change Control Guidelines (GMP变更管理指南)

Introduction. GMP change control guidelines are critical documents that define the processes and procedures for managing changes to computerized systems used in pharmaceutical manufacturing. These guidelines ensure that changes to the systems are controlled and validated to maintain the integrity of the data and the quality of the products produced.

Purpose of GMP Change Control Guidelines. The primary purposes of GMP change control guidelines are to:
- Ensure that changes to computerized systems are controlled and validated.
- Minimize the risk of introducing errors or compromising the integrity of the data.
- Maintain compliance with regulatory requirements.

Key Elements of GMP Change Control Guidelines. GMP change control guidelines typically include the following key elements:
- Scope: The scope of the guidelines defines the systems and processes to which they apply.
- Roles and Responsibilities: The guidelines assign roles and responsibilities for the various aspects of change control, including initiation, review, approval, and implementation.
- Change Request Process: The change request process describes the steps involved in requesting, reviewing, and approving changes to the system.
- Impact Assessment: The guidelines include procedures for assessing the potential impact of changes on the system, including its functionality, data integrity, and security.
- Validation and Testing: The guidelines specify the requirements for validating and testing changes to the system to ensure that they meet the intended requirements.
- Documentation: The guidelines require that all changes to the system be documented and maintained for audit purposes.

Benefits of GMP Change Control Guidelines. GMP change control guidelines provide numerous benefits, including increased data integrity, reduced risk of errors, improved compliance, enhanced system performance, and reduced downtime.
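Below is a minimal sketch, in Python, of how the change request process described above might be represented in a computerized system. The states, field names, and transition rules are assumptions for illustration, not requirements taken from any GMP guideline.

```python
# Minimal sketch of a change-request record and its controlled state flow
# (initiation -> review -> approval -> implementation), with an audit trail
# kept for documentation purposes. All names and states are illustrative.

from dataclasses import dataclass, field
from datetime import datetime

ALLOWED_TRANSITIONS = {
    "initiated": {"under_review"},
    "under_review": {"approved", "rejected"},
    "approved": {"implemented"},
}

@dataclass
class ChangeRequest:
    system: str
    description: str
    impact_assessment: str = ""   # functionality, data integrity, security
    status: str = "initiated"
    audit_trail: list = field(default_factory=list)  # documentation for audits

    def transition(self, new_status: str, actor: str) -> None:
        """Refuse any state change the process does not allow."""
        if new_status not in ALLOWED_TRANSITIONS.get(self.status, set()):
            raise ValueError(f"illegal transition {self.status} -> {new_status}")
        self.audit_trail.append((datetime.utcnow().isoformat(), actor, new_status))
        self.status = new_status

cr = ChangeRequest(system="LIMS", description="Add audit-log retention setting")
cr.transition("under_review", actor="QA reviewer")
cr.transition("approved", actor="QA manager")
cr.transition("implemented", actor="system owner")
print(cr.status, "with", len(cr.audit_trail), "audit entries")
```

The point of the sketch is the controlled transition table: a change cannot reach "implemented" without passing review and approval, mirroring the initiation/review/approval/implementation responsibilities assigned in the guidelines.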
Software Engineering Course English Exam Questions, Southwest University and Deakin University (西南大学及迪肯大学软件工程课程英文试题)

Question 1: What is the software process and why is it important? List the four main software process models, and describe the drawbacks of the RAD model. Discuss the primary differences between incremental, iterative and agile models.

Answer:
1) What is the software process and why the process model is important:
A software process (also known as a software methodology) is a set of related activities that leads to the production of the software. These activities may involve the development of the software from scratch, or modifying an existing system.
Any software process must include the following four activities:
1. Software specification (or requirements engineering): Define the main functionalities of the software and the constraints around them.
2. Software design and implementation: The software is to be designed and programmed.
3. Software verification and validation: The software must conform to its specification and meet the customer needs.
4. Software evolution (software maintenance): The software is modified to meet customer and market requirement changes.
In practice, these include sub-activities such as requirements validation, architectural design, unit testing, etc. There are also supporting activities such as configuration and change management, quality assurance, project management, and user experience, along with other activities that aim to improve the above activities by introducing new techniques and tools, following best practice, process standardization (so the diversity of software processes is reduced), etc.
When we talk about a process, we usually talk about the activities in it. However, a process also includes the process description, which includes:
1. Products: The outcomes of the activity. For example, the outcome of architectural design may be a model for the software architecture.
2. Roles: The responsibilities of the people involved in the process. For example, the project manager, programmer, etc.
3. Pre- and post-conditions: The conditions that must be true before and after an activity. For example, the precondition of architectural design is that the requirements have been approved by the customer, while the postcondition is that the diagrams describing the architecture have been reviewed.
The software process is complex and relies on making decisions. There is no ideal process, and most organizations have developed their own software processes. For example, an organization working on critical systems has a very structured process, while for business systems with rapidly changing requirements, a less formal, flexible process is likely to be more effective.
2) Four main process models: the waterfall model, incremental model, iterative model and agile model.
3) Drawbacks of the RAD model:
- It requires a sufficient number of human resources to create enough teams.
- If developers and customers are not committed, the system will result in failure.
- If the system cannot be properly modularized, building the components may be problematic.
- It is not applicable when there is a high possibility of technical risk.
4) The primary differences between incremental, iterative and agile models: Each increment in the incremental approach builds a complete feature of the software, while in the iterative approach, each iteration builds small portions of all the features. An agile approach combines the incremental and iterative approaches by building a small portion of each feature, one by one, and then both gradually adding features and increasing their completeness.

============================================================================

Question 2: What software life-cycle model would you use if there is significant technical risk and the customer's requirements are not well-known in advance? Justify your answer in a short essay.

Question 3: Consider an automated library circulation system.
- Every book has a bar code and every borrower has a card bearing a bar code.
- When a borrower wishes to check out a book, the librarian scans the bar code on the book and on the borrower's card, and then enters C at the computer terminal.
- When a book is returned, it is again scanned and the librarian enters R.
- Librarians can add books (+) to the library collection or remove them (-).
- Borrowers can go to a terminal and determine:
  - all the books in the library by a particular author (the borrower types A= followed by the author's name)
  - all the books with a specific title (the borrower types T= followed by the title)
  - all the books in a particular subject area (the borrower types S= followed by the subject area)
- If a borrower wants a book that is currently checked out, the librarian can place a hold on the book so that, when it is returned, it will be held for the borrower who requested it (the librarian types H= followed by the number of the book).

Questions:
1) Draw a dataflow diagram for the system, showing as much detail as you can without making assumptions about implementation.
2) Draw a state transition diagram for the system.

The dataflow diagram for the entire system is shown immediately below. The Process Command process is shown using a hierarchical expansion in the second diagram.
NOTE about the diagram: The diagram uses separate states for reading commands and bar codes to show the sequence in which the bar codes and commands are expected.
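As a companion to the Question 3 answer, a minimal sketch of the command dispatching that the diagrams describe is given below (in Python; the handler behavior is paraphrased from the problem statement, and the data stores are simplified stand-ins, not an actual implementation of the answer's diagrams).

```python
# Minimal sketch of the library circulation command processing described in
# Question 3: each terminal entry determines which inputs (bar codes or
# search keys) are expected next. Names and return strings are illustrative.

def process_command(line: str, checked_out: set, holds: dict) -> str:
    """Dispatch one terminal entry; `checked_out` and `holds` stand in for
    the system's data stores."""
    if line == "C":
        return "scan book bar code, then borrower bar code, then check out"
    if line == "R":
        return "scan book bar code, return book, honor any hold"
    if line == "+":
        return "scan new book bar code, add to collection"
    if line == "-":
        return "scan book bar code, remove from collection"
    if line.startswith("A="):
        return f"list all books by author {line[2:]}"
    if line.startswith("T="):
        return f"list all books titled {line[2:]}"
    if line.startswith("S="):
        return f"list all books in subject area {line[2:]}"
    if line.startswith("H="):
        holds[line[2:]] = "held for requesting borrower on return"
        return f"hold placed on book {line[2:]}"
    return "unrecognized command"

holds = {}
print(process_command("A=Knuth", set(), holds))
print(process_command("H=1234", set(), holds))
```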
Computer English (2nd Edition): Answers for New Material (计算机英语(第2版)新增答案)

Reference answers for Computer English (2nd Edition). Note: only the answers for texts that are new or changed in the 2nd edition are given here; for the unchanged texts, see the original answers for Computer English (1st Edition).

Unit One, Section C: PDA Prizefight: Palm vs. Pocket PC

I. Fill in the blanks with the information given in the text:
1. With DataViz's Documents To Go, you can view and edit desktop documents on your PDA without converting them first to a PDA-specific ________. (format)
2. Both Palm OS and Windows Mobile PDAs can offer e-mail via ________ so that new messages received on your desktop system are transferred to the PDA for on-the-go reading. (synchronization)
3. The Windows Mobile keyboard, Block Recognizer, and Letter Recognizer are all ________ input areas, meaning they appear and disappear as needed. (virtual)
4. Generally speaking, Windows Mobile performs better in entering information and playing ________ files while Palm OS offers easier operation, more ________ programs, better desktop compatibility, and a stronger e-mail application. (multimedia; third-party)

II. Translate the following terms or phrases from English into Chinese and vice versa:
1. data field — 数据字段
2. learning curve — 学习曲线
3. third-party solution — 第三方解决方案
4. Windows Media Player — Windows媒体播放器
5. 开始按钮 — Start button
6. 指定输入区 — designated input area
7. 手写体识别系统 — handwriting-recognition system
8. 字符集 — character set

Unit Three, Section B: Longhorn: The Next Version of Windows

I. Fill in the blanks with the information given in the text:
1. NGSCB, the new security architecture Microsoft is developing for Longhorn, splits the OS into two parts: a standard mode and a(n) ________ mode. (secure)
2. It is reported that Longhorn will provide different levels of operation that disable the more intensive Aero effects to boost ________ on less capable PCs. (performance)
3. With Longhorn's new graphics and presentation engine, we can create and display Tiles on the desktop, which remind us of the old Active Desktop but are based on ________ instead of ________. (XML; HTML)
4. The most talked-about feature in Longhorn so far is its new storage system, WinFS, which works like a(n) ________ database. (relational)

II. Translate the following terms or phrases from English into Chinese and vice versa:
1. search box — 搜索框
2. built-in firewall — 内置防火墙
3. standalone application — 独立应用程序
4. active desktop — 活动桌面
5. mobile device — 移动设备
6. 专有软件 — proprietary software
7. 快速加载键 — quick-launch key
8. 图形加速器 — graphics accelerator
9. 虚拟文件夹 — virtual folder
10. 三维界面 — three-dimensional interface

Unit Four, Section C: Arrays

I. Fill in the blanks with the information given in the text:
1. Given the array called object with 20 elements, if you see the term object10, you know the array is in ________ form; if you see the term object[10], you know the array is in ________ form. (subscript; index)
2. In most programming languages, an array is a static data structure. When you define an array, the size is ________. (fixed)
3. A(n) ________ is a pictorial representation of a frequency array. (histogram)
4. An array that consists of just rows and columns is probably a(n) ________ array. (two-dimensional)

II. Translate the following terms or phrases from English into Chinese and vice versa:
1. bar chart — 条形图
2. frequency array — 频率数组
3. graphical representation — 图形表示
4. multidimensional array — 多维数组
5. 用户视图 — user('s) view
6. 下标形式 — subscript form
7. 一维数组 — one-dimensional array
8. 编程结构 — programming construct
(A small sketch of a frequency array and histogram appears after this answer key.)

Unit Five, Section B: Microsoft .NET vs. J2EE

I. Fill in the blanks with the information given in the text:
1. One of the differences between C# and Java is that Java runs on any platform with a Java Virtual ________ while C# only runs in Windows for the foreseeable future. (Machine)
2. With .NET, Microsoft is opening up a channel both to ________ in other programming languages and to ________. (developers; components)
3. J2EE is a single-language platform; calls from/to objects in other languages are possible through ________, but this kind of support is not a ubiquitous part of the platform. (CORBA)
4. One important element of the .NET platform is a common language ________, which runs bytecodes in an Internal Language format. (runtime)

II. Translate the following terms or phrases from English into Chinese and vice versa:
1. messaging model — 消息收发模型
2. common language runtime — 通用语言运行时刻(环境)
3. hierarchical namespace — 分等级层次的名称空间
4. development community — 开发社区
5. CORBA — 公用对象请求代理(程序)体系结构
6. 基本组件 — base component
7. 元数据标记 — metadata tag
8. 虚拟机 — virtual machine
9. 集成开发环境 — IDE (integrated development environment)
10. 简单对象访问协议 — SOAP (Simple Object Access Protocol)

Unit Six, Section A: Software Life Cycle

I. Fill in the blanks with the information given in the text:
1. The development process in the software life cycle involves four phases: analysis, design, implementation, and ________. (testing)
2. In the system development process, the system analyst defines the user, needs, requirements and methods in the ________ phase. (analysis)
3. In the system development process, the code is written in the ________ phase. (implementation)
4. In the system development process, modularity is a very well-established principle used in the ________ phase. (design)
5. The most commonly used tool in the design phase is the ________. (structure chart)
6. In the system development process, ________ and pseudocode are tools used by programmers in the implementation phase. (flowcharts)
7. Pseudocode is part English and part program ________. (logic)
8. While black box testing is done by the system test engineer and the ________, white box testing is done by the ________. (user; programmer)

II. Translate the following terms or phrases from English into Chinese and vice versa:
1. standard graphical symbol — 标准图形符号
2. logical flow of data — 数据的逻辑流程
3. test case — 测试用例
4. program validation — 程序验证
5. white box testing — 白盒测试
6. student registration system — 学生注册系统
7. customized banking package — 定制的金融软件包
8. software life cycle — 软件生命周期
9. user working environment — 用户工作环境
10. implementation phase — 实现阶段
11. 测试数据 — test data
12. 结构图 — structure chart
13. 系统开发阶段 — system development phase
14. 软件工程 — software engineering
15. 系统分析员 — system(s) analyst
16. 测试工程师 — test engineer
17. 系统生命周期 — system life cycle
18. 设计阶段 — design phase
19. 黑盒测试 — black box testing
20. 会计软件包 — accounting package

III. Fill in each of the blanks with one of the words given in the following list, making changes if necessary:
development; testing; programmer; chart; engineer; attend; interfaces; system; software; small; user; develop; changes; quality; board; uncontrolled

IV. Translate the following passage from English into Chinese:
软件工程是软件开发的一个领域;在这个领域中,计算机科学家和工程师研究有关的方法与工具,以使高效开发正确、可靠和健壮的计算机程序变得容易。
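Unit Four above defines frequency arrays and histograms; the promised sketch of both is given below (the score data are invented for illustration).

```python
# Minimal sketch for Unit Four (Arrays): build a frequency array from scores
# and print a rough text histogram. Data values are invented.

scores = [2, 3, 3, 1, 4, 3, 2, 4, 4, 4]     # e.g., quiz scores from 0 to 4

frequency = [0] * 5                          # static array, fixed size
for s in scores:
    frequency[s] += 1                        # index form: frequency[s]

for value, count in enumerate(frequency):    # histogram: pictorial frequency array
    print(f"{value}: {'*' * count}")
```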
STATEMENT OF WORK: Translation (工作说明译文)
TABLE OF CONTENTS
SECTION 1.0 SUMMARY
SECTION 2.0 GENERAL INFORMATION
SECTION 3.0 PROGRAM MANAGEMENT
SECTION 4.0 PRODUCT ENGINEERING
SECTION 5.0 BUILD REQUIREMENTS
SECTION 6.0 SUMMARY OF DELIVERABLE ITEMS, APPROVALS AND SCHEDULES
SECTION 7.0 ENGINEERING SPECIFICATIONS

1.0 SUMMARY
1.1 Executive Summary
This SOW specifies tasks to be provided by the Supplier during the design, development, qualification, production and service life of the Isuzu RG01/RJ01/GM S1xx program. The program includes all tasks associated with the design and supply of the INSULATOR, HOOD and/or related tools or systems in all phases of the vehicle development process. This document describes the direction, timing and services to be provided to the Supplier, and/or systems to be received from the Supplier during the course of this program. It is intended that this document, along with all referenced documents, shall convey the program requirements to the Supplier; however, it does not preclude other requirements from being incorporated into the component or system.
VERIFICATION, VALIDATION, AND ACCREDITATION IN THE LIFE CYCLE OF MODELS AND SIMULATIONS
Proceedings of the 2000 Winter Simulation Conference
J. A. Joines, R. R. Barton, K. Kang, and P. A. Fishwick, eds.

VERIFICATION, VALIDATION, AND ACCREDITATION IN THE LIFE CYCLE OF MODELS AND SIMULATIONS

Jennifer Chew
HQ, U.S. Army Developmental Test Command
ATTN: CSTE-DTC-TT-M, Aberdeen Proving Ground, MD 21005-5055, U.S.A.

Cindy Sullivan
U.S. Army Yuma Proving Ground
ATTN: CSTE-DTC-YP-CD, Building 2105, Yuma, AZ 85365, U.S.A.

ABSTRACT
Verification, validation, and accreditation (VV&A) activities should be an on-going process throughout the life cycle of models and simulations (M&S). It is important to note that there is no single set of VV&A tasks, events, or methods that would apply every time to every situation. The VV&A emphasis and methods used vary depending on the particular life cycle phase it is in, previous VV&A and use, the risks and uncertainty, its size and complexity, and of course, the resources available. For simplification, this paper discusses the activities and tasks during the early stages of model development and addresses each of the VV&A efforts separately, along with its associated activities. It outlines the specific VV&A activities and products that are appropriate to each phase of model development.

1 INTRODUCTION
In recent years, the Department of Defense (DoD) has aggressively applied M&S in wargaming, analysis, design, testing, etc., to support acquisition decisions. One caveat is that if the model is intended to be used by DoD, then the model must be verified and validated to ensure that the simulation outputs are sufficiently credible for its intended use(s). While the DoD is responsible for its own M&S, M&S that are developed and/or used by industry and academia in support of DoD acquisition activities must also comply with the DoD VV&A policy. The information presented herein has been compiled from a wide variety of sources, including DoD directives and instructions related to M&S management and VV&A, software industry standards and practices, and academic text and professional literature.

The VV&A activities contained herein are broadly applicable to all stand-alone models and federates which are used for supporting DoD acquisition decisions. Federates are individual M&S products that are capable of joining High Level Architecture-based federations. This paper does not cover the VV&A of a federation of models. VV&A of a federation must be completed after doing VV&A on each of its federates. The activities described in this paper are intended to be used for planning, producing, and documenting proper evidence to support the VV&A of M&S. This paper is also intended to help the reader to plan for and develop structured and organized VV&A activities; provide a systematic approach for preparing VV&A documentation; and give a better understanding of how VV&A can be an integral part of the M&S life cycle. It emphasizes activities that are crucial during each phase of M&S development and use.

Too often Verification and Validation (V&V) are considered separately from development and documentation. The V&V plans and process should begin on the first day of development and continue in such a manner that the same documentation used for requirements, design, development, and configuration control also serves to support V&V activities. Finding and resolving problems early via application of V&V can significantly reduce the subsequent cost of M&S design, development, and testing. There are many V&V tasks that the M&S developer should be doing before and during model development.
As a matter of fact, VV&A activities should begin as soon as there is a decision to apply M&S to a problem. The planning effort for VV&A is as important as implementing it. The earlier we start the V&V planning, the easier it is to implement. It is always good practice to ensure that all pertinent information is documented along the way.

It is important to note that all the VV&A activities are tailorable to the specific requirements. Unless there is high impact given a failure (e.g., cost or safety) or it is a very large and/or complex developmental effort, we probably do not need to accomplish every task or method mentioned in this paper. There is no single set of VV&A tasks, events, or methods that applies exclusively every time to every situation. VV&A emphasis and methods used vary depending on the particular life cycle phase it is in, previous VV&A and use, the risks and uncertainty, its size and complexity, and resources available. The depth of analysis involved with the V&V of an established legacy model would be different from the development of a new M&S. Likewise, the available information for the accreditation of a legacy model might be based more on historical performance than results from the detailed tasks outlined in this paper for a new M&S.

There are many ways and techniques to accomplish VV&A. Although there is an abundance of literature on VV&A advocating diverse methods, this paper compresses the information to provide a simplified process that focuses on the activities and tasks during each phase of the development. For simplification, this paper addresses the VV&A activities and products that apply to each M&S development phase.

2 VV&A IN THE LIFE CYCLE OF M&S
Figure 1 shows a typical life cycle of an M&S and its associated VV&A activities. These activities or tasks may be tailored and applied differently based on the depth of analysis, as required by the user or established acceptability criteria. The authoritative data source (ADS) library, as shown in Figure 1, contains DoD data sources used for supporting M&S, which are cataloged and available through the Defense Modeling and Simulation Office website at <www.dmso.mil>.

The remainder of this paper examines each of the VV&A phases and discusses the activities associated with them.

2.1 Requirements Verification and Validation
The M&S development should begin with a clear and unambiguous statement of the problem that the M&S are intended to address. A good definition of the problem makes it easier to define M&S requirements such as simulation outputs, functions, and interactions. It is also important to specify, at least in general terms, how much like the real world the user needs these outputs, functions, and interactions to be. We believe that the most critical piece of the M&S development and V&V activities falls in the very beginning of the life cycle. If the requirements do not make sense or are not well understood, then the M&S will not do what was originally intended.

Basically, this phase of the process is primarily involved in reviewing the requirement documentation and in documenting all findings. The review focuses on the intended use, acceptability criteria for model fidelity, traceability, quality, configuration management, and fidelity of the M&S to be developed. This is done to ensure that all the requirements are clearly defined, consistent, testable, and complete.

The first step is to gather information.
Any information related to the M&S and its requirements increases understanding of the requirements and helps in making the right decisions. It may not be obvious that one of the most critical V&V efforts is to review all the information gathered and document all the findings. This could include:
- Requirements
- Interface requirements
- Developmental plans
- Previous V&V plans and results
- Configuration Management Plan
- Quality Assurance Plans
- Studies and Analyses

Documenting all the findings, assumptions, limitations, etc., from reviewing every piece of related information about the M&S is extremely important. We review the requirement documentation, determine the risk areas, and assess the criticality of specific factors that need the most attention. Again, we document the assessment and highlight the areas that may need further analysis. We report all the findings to the sponsor/user and have all the discrepancies resolved before continuing with any further major efforts.

The following should be considered when tailoring. If the intended use is not adequately documented, the V&V team may need to talk to the users and document the intended use themselves. If the model has interfaces, these need to be verified to determine if the interface structure is adequate. User interfaces need to be analyzed to determine how accurately the interface is integrated into the overall M&S and for human factors engineering, for example, requirements to accommodate the number, skill levels, duty cycles, training needs, or other information about the personnel who will use or support the model. If this is a developmental effort or the developers are available, the V&V team may be able to participate in requirements review and ask the developers questions face-to-face. The following system engineering factors may be important to assess for adequacy:
- adaptation of installation independent data
- safety (prevent/minimize hazards to personnel, property, and physical environment)
- security and privacy
- for software, the computer hardware and operating system
- for hardware, the environment during transportation, storage, and operation, e.g., wind, rain, temperature, geographical location, motion, shock, noise, and electromagnetic radiation
- computer resources used by the software or incorporated into the hardware
- design and construction constraints
- logistics
- packaging

The requirements V&V phase culminates with the documentation of the intended use, requirements traceability matrix, unsupported requirements, acceptability criteria for model fidelity, risk assessment, and model fidelity.

2.2 Conceptual Model Verification & Validation
A conceptual model is a preliminary or proposed design framework that is based on the outputs, functions, and interactions defined during the requirements V&V described in Section 2.1. A conceptual model typically consists of a description of how the M&S requirements are broken down into component pieces, how those pieces fit together and interact, and how they work together to meet the requirements specified. It should also include a description of the equations and algorithms that are used to meet the requirements, as well as an explicit description of any assumptions or limitations made or associated with the theories, concepts, fidelity, derivatives, logic, interfaces, or solution approaches.
The process of determining the adequacy of the conceptual model and ensuring that it meets the specified requirements and intended use(s) is called conceptual model V&V.

One of the initial tasks for conceptual model V&V is to finalize and agree on the acceptability criteria for model fidelity and to define the criticality of data inputs and outputs. The importance of data is discussed in Section 2.6. Acceptability criteria and data requirements are used to ensure that each step of the conceptual model framework is traceable to the requirements, and ultimately to these criteria. These criteria are established by the accreditation approval authority defining the terms and conditions of the M&S that will be considered acceptable for the application. Therefore, a set of test cases must be defined to ensure that all the simulation scenarios and trials will adequately address the requirements and satisfy the acceptability criteria. It is crucial that we adequately verify and validate the conceptual model, from which the code is generated and/or hardware is built.

The products of conceptual model V&V are model characteristics, input/output data items, interface issues, measure of model fidelity, potential weaknesses and limitations, perceived strengths, and traceability between conceptual model and requirements.

2.3 Design Verification
After the conceptual model is verified and validated, the developer produces a detailed design that describes exactly how the conceptual model will be coded or fabricated. It defines the components, elements, functions, and specifications that will be used to produce the simulation based on the conceptual model. Before a single line of software code is written or hardware is fabricated, we should review the detailed design to ensure it conforms to the conceptual model. This step is called Design Verification. It involves a mapping of the proposed design elements back to the conceptual model and requirements to ensure that there is traceability between those requirements and the proposed design. We should also develop test cases that can be traced back to the design and requirements.

Although traceability is the main focus during design verification, other activities such as participating in design reviews, audits, walkthroughs, and inspections are important. For software, it is also important to verify input data; determine computer-aided software engineering tools and design methodology; conduct internal software testing; and perform software metrics analysis. For hardware, it is important for subject matter experts to review the adequacy of drawings (e.g., schematic drawings), interface control drawings, and, as appropriate, the adequacy of the electrical design, mechanical design, power generation and grounding, electrical and mechanical interface compatibility, and mass properties.

This phase culminates with the traceability matrix (detailed design to requirements, to conceptual model, and to test cases), design and requirement cross reference matrix, design walkthrough or inspection report, input data verification, software metric and test reports, and CASE tools.

2.4 Code Verification and Hardware Checkout
After the design is verified, the conceptual model and its associated design are converted into code or hardware by the developer.
Code verification and hardware checkout ensure that the detailed design is being implemented correctly in the code or hardware, respectively.

Code verification normally entails detailed desk checking and software testing of the code, comparing it to the detailed design, documenting any discrepancies and fixing any problems discovered. Other important activities include participating in code testing, audits, walkthroughs, and inspections; validating input data; preparing a complexity report; conducting code analysis; and verifying code structure.

Hardware checkout entails reviews, audits and inspections, comparing the hardware to its design, documenting any discrepancies and fixing any problems.

This phase culminates with the design functionality, code walkthrough or inspection report, complexity metric report, input data validation, coding/interface/logic errors, and syntax and semantics.

2.5 Code and/or Hardware Testing
After the design and the initial implementation are completed, the developer integrates the code and/or hardware together and tests it. These tests are intended to verify and validate the M&S. Verification tests the correctness of the M&S to ensure that it accurately represents the developer's requirements, conceptual description, and design. Validation tests the extent to which an M&S accurately represents the real world from the perspective of the intended use of the M&S.

Verification tests that the M&S requirements, conceptual model and design are implemented as documented in the previous phases. Acceptance testing determines whether all requirements are satisfied. Compliance testing determines if the simulation meets required security and performance standards. Test cases should be traceable to the documented requirements and design to ensure that all were met. Metrics that may be used, if this is a large software development, include breadth and depth of testing, fault profiles, and reliability metrics. The breadth of testing metric (% requirements tested and % requirements addressed) addresses the degree to which required functionality has been successfully demonstrated as well as the amount of testing that has been performed. The depth of testing metric (% tested and passed testing) measures the amount of testing achieved on the software architecture, that is, the extent and success of testing the possible control and data paths and conditions within the software. Automated tools may be used to compute this measure. Fault profiles (open versus closed anomalies) provide insight into the number and type of deficiencies in the current baseline, as well as the developer's ability to fix known faults. The reliability metric (mean time between failures) expresses the contribution to reliability. (A small computational sketch of these metrics appears after this paper.)

The two issues that must be addressed during validation testing are to identify the real world being modeled and to identify the key structural characteristics and output parameters that are to be used for comparisons. In other words, validation has to do with the fidelity of the M&S. Fidelity is normally defined by the sponsor/user and is judged by several factors, one of which is its ability to predict the known behavior, or best estimate, of the real system when subjected to the same stimuli. The fidelity level is actually defined when the sponsor/user establishes the acceptability criteria for model fidelity. If the M&S is designed with these criteria in mind, then very likely the M&S will fall within the defined fidelity boundary and be acceptable to the sponsor/user.
Otherwise, there is a chance of going back to the drawing board. Defining the acceptability criteria up-front is crucially important.

In those cases where there is no user or the user simply cannot come up with a set of criteria, we should make sure that all pertinent information about the M&S and the assumptions is documented every step of the way. For a user, validation is by far the most important phase of the M&S life cycle. Validation gives solid evidence to help analyze the extent to which the M&S are representing the real world. It is also critical that we assess the degree of detail that must be represented in the simulation to provide acceptable results and the degree of correspondence with real world phenomena that will be sufficient for use with high confidence. If the significant parameters of a real system have been properly incorporated into a model, a simulation experiment should reflect the behavior of a real system down to some level of detail commensurate with that description.

There are many validation techniques, such as using subject matter experts, comparison techniques, and face validation, to name just a few. Validation based upon direct comparison of model results to the real world provides more credibility than other validation methods. Selection of techniques is based on the user's needs, M&S types, intended uses, and other factors.

Regardless of the techniques used, the following products should be generated as part of the testing: model fidelity assessment; traceability between requirements, design, and test cases; subject matter expert opinions; M&S and real world comparison; model limitation and impact statement; sensitivity analysis report; test results; and metric report.

2.6 Accreditation
Accreditation is the official determination by the user that the capabilities of the M&S fit the intended use and that the limitations of the M&S will not interfere in drawing the correct conclusions. Accreditation planning should not wait until after the development is completed. It should begin when the requirements are being verified and validated, because the first task, when preparing the accreditation plan, is to develop the acceptability criteria. Acceptability criteria established in the accreditation plan are what the user has identified as key characteristics for use in deciding whether or not to grant an accreditation for the particular M&S. Accreditation occurs at two levels: Class of Applications and Application-specific.

Accreditation at the Class of Applications level accredits an M&S for a generic set of purposes or applications and includes reviewing a complete audit trail of the development and use of the M&S. The audit trail includes reviews of M&S documentation, V&V documentation, configuration control, M&S assumptions, previous successful uses, and recognition of users' acceptances.

Accreditation of Application-specific level M&S includes data certification, scenarios, and the qualification/training of the operator-analysts who will use the M&S.

All M&S are driven by data, either as direct inputs or as embedded values that drive simulation characteristics. As perfect as the equations, algorithms, and software design of an M&S may be after conceptual model validation and design verification, it will probably fail results validation if the data that drive the simulation are inaccurate or inappropriate for the task at hand. A relationship clearly exists between producer data V&V activities and user data V&V requirements throughout the M&S life cycle.
However, there is a distinction between data V&V activities performed by the producer and by the user. Producer data V&V determines data quality in terms of the correctness, timeliness, accuracy, completeness, relevance, and accessibility that make data appropriate for the purpose intended, with values within the stated criteria and assumptions. User data V&V ensures that the data are transformed and formatted correctly and that the data meet user specified constraints. Data accreditation is an integral part of the M&S accreditation procedures to ensure that M&S data are verified as correct, and validated as appropriate and reasonable for the intended application.

3 CONCLUSIONS
VV&A may sound challenging or even impossible. This should not be the case if proper VV&A activities are conducted throughout the M&S life cycle, especially during the early stages. Early VV&A planning can reduce or even eliminate many concerns that may arise at later stages. In fact, early planning can also allow you more flexibility in selecting the right V&V techniques and activities to fit the specific needs. However, many considerations exist during the M&S planning stage. For example:
- Model acceptability criteria and V&V requirements/planning must be established and agreed upon by all parties concerned before any activities are defined.
- V&V activities can be very labor-intensive and must be focused and carefully scoped according to specific accreditation requirements.
- The V&V plan changes as the M&S project matures. V&V planning should not be considered final until after V&V has actually been accomplished.
- Validation depends on the intended use and fidelity of the M&S, and it will likely change as new users are identified.
- V&V should begin on day one of the M&S development, should be an integral part of the M&S development, and should be a continuous process.
- When planning for V&V activities, alternate methods should be included to facilitate schedule-driven events and to adjust as new techniques are developed.
- V&V efforts require an experienced and well-trained team.

ACKNOWLEDGMENTS
The authors would like to recognize Mr. Bob Lewis, Tecmaster, Inc., for his support to the development of the VV&A activities in the Life Cycle of M&S. His significant contributions have made this paper possible.

REFERENCES
Knepell, P. L. 1999. VV&A of Models and Simulations (A Five-Day Workshop) Participant Guide. Peak Quality Services, Colorado Springs, CO.
Department of Defense. 1996. Department of Defense Verification, Validation and Accreditation (VV&A) Recommended Practices Guide. Defense Modeling and Simulation Office, Alexandria, VA. (Co-authored by O. Balci, P. A. Glasow, P. Muessig, E. H. Page, J. Sikora, S. Solick, and S. Youngblood.)
Department of the Army. 1997. Army Regulation 5-11: Management of Army Models and Simulations. Washington, DC.
U.S. Army Developmental Test Command (DTC). 1998. Developmental Test Command Verification, Validation, and Accreditation (VV&A) Methodology. DTC Pamphlet 73-4, Aberdeen Proving Ground, MD.
Department of the Army. 1999. Verification, Validation, and Accreditation of Army Models and Simulations. Pamphlet 5-11, Army Modeling and Simulation Office, Crystal City, VA.

AUTHOR BIOGRAPHIES
JENNIFER CHEW is an Electronics Engineer in the Technology Management Division, HQ U.S. Army Developmental Test Command (DTC), Aberdeen Proving Ground, Maryland. She supports the development of the DTC Virtual Proving Ground program and has the lead in developing the DTC VV&A process and methodology. She received her B.S.
in Chemical Engineering from the University of Maryland and an M.S. in Electrical Engineering Science from Loyola College. She is a graduate of the Army Management Staff College and the Quality and Reliability Engineering program. Her email address is <chewj@>.

CINDY L. SULLIVAN is an Operations Research Analyst and manages the Yuma Proving Ground Virtual Proving Ground Program. She received her B.S. in Computer Science from Freed-Hardeman College and her M.S. in Industrial Engineering from the University of Missouri-Columbia. She has 14 years of experience working with Army M&S and has earned two Army Achievement Medals. She was the primary author of DTC Pam 73-4, M&S VV&A Methodology. Her email address is <Cindy.Sullivan@>.
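Section 2.5 of the paper above names breadth and depth of testing, fault profiles, and mean time between failures as candidate metrics. The following is a minimal computational sketch of those metrics; all input values are invented for illustration.

```python
# Minimal sketch of the software testing metrics named in Section 2.5.
# All input values are invented for illustration.

requirements_total = 120
requirements_tested = 102                   # breadth: % requirements tested
requirements_addressed = 96                 # breadth: % requirements addressed
breadth_tested = 100 * requirements_tested / requirements_total
breadth_addressed = 100 * requirements_addressed / requirements_total

paths_total, paths_tested, paths_passed = 800, 640, 608
depth_tested = 100 * paths_tested / paths_total    # depth: % of control/data paths tested
depth_passed = 100 * paths_passed / paths_total    # depth: % tested and passed

open_anomalies, closed_anomalies = 14, 87          # fault profile (open vs. closed)
uptime_hours, failures = 5_000, 4
mtbf = uptime_hours / failures                     # reliability: mean time between failures

print(f"breadth: {breadth_tested:.1f}% tested, {breadth_addressed:.1f}% addressed")
print(f"depth:   {depth_tested:.1f}% of paths tested, {depth_passed:.1f}% passed")
print(f"faults:  {open_anomalies} open vs {closed_anomalies} closed")
print(f"MTBF:    {mtbf:.0f} hours")
```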
Huang Baobin (黄宝斌): Professional English for Drug Testing (药品检验专业英语)
4. Sample Handover (样品移交)
Referring to the GPCL, find the vocabulary corresponding to each step:
1. Test request form (检验申请表): standard test request form  2. Receiving samples (收样): incoming samples, samples received by laboratories  3. Appearance check (外观检查): visual inspection of submitted sample
- Vocabulary itself is a form of comparison
- Comparison makes the goal and the gap clear
Comparison material: a Chinese translation of reliable quality is needed
How to do the comparison:
1. Learning vocabulary: read the Chinese first, then the English, comparing the two as you read
2. Learning translation and writing: first translate the Chinese into English, then compare your translation with the original English
ICH technical requirements and WHO expert committee reports on pharmaceuticals
4. Registration (登记): registration
5. Sample retention (留样): retention; handover (移交): transfer
Learning related vocabulary through the example of the "sample receipt" step (continued)
7. Requirements on sample quantity: "a sample to be taken and divided into three approximately equal portions for submission to the laboratory" (the sample is to be divided into three approximately equal portions)
Professional English vocabulary for drug testing: the quality management system of drug control laboratories
WHO GPCL (Good Practice for Pharmaceutical Quality Control Laboratories). The WHO GPCL comprises 4 parts with 21 items:
Sample English Essay on the Testing Process (检验过程英语范文)
Testing Process

Introduction
Testing is an essential part of the software development life cycle. It involves evaluating and verifying the functionality, performance, and reliability of a software product. Proper testing helps to identify defects and bugs, ensuring that the software meets the requirements and quality standards set by the stakeholders. In this article, we will discuss the testing process, including its goals, types of testing, and the steps involved.

Goals of Testing
The primary goal of testing is to find as many defects as possible to ensure the software's quality. However, testing also helps to achieve other important objectives:
1. Verification and Validation: Testing helps to verify that the software meets the specified requirements and validate that it works as expected.
2. Reliability and Stability: Testing helps to improve the software's reliability and stability, ensuring that it operates without errors or crashes.
3. Performance: Testing evaluates the software's performance, ensuring that it performs efficiently under different loads and scenarios.
5. Security: Testing evaluates the software's security measures, identifying vulnerabilities and ensuring data protection.

Types of Testing
3. System Testing: It tests the entire software system as a whole to validate its functionality, performance, and reliability.
4. Acceptance Testing: This type of testing is performed by end-users to validate if the software meets their requirements and expectations.
5. Performance Testing: It evaluates how the software performs under different workloads and stress conditions.
6. Security Testing: This testing ensures that the software is secure against threats and vulnerabilities.
7. Usability Testing: It focuses on evaluating the software's user interface and user experience.

Steps Involved in the Testing Process
The testing process generally follows a series of steps (a small test-code sketch follows this essay):
1. Test Planning: In this step, the testing team determines the scope, objectives, and resources required for testing. The test plan is created, including the testing approach, test cases, and schedules.
2. Test Design: In this step, the test cases are designed based on the requirements and specifications. The test scenarios and data are also identified.
3. Test Environment Setup: The required hardware, software, and test data are set up for testing. The test environment should replicate the production environment as closely as possible.
5. Defect Tracking: In this step, the identified defects are logged in a defect tracking system. The defects are categorized, prioritized, and assigned to the development team for fixing.
6. Test Reporting: Test reports are generated, summarizing the testing activities, including the test coverage, test results, and defect metrics.
7. Retesting and Regression Testing: Once the defects are fixed, retesting is performed to ensure that the defects have been resolved. Regression testing is also conducted to verify that the changes or fixes have not introduced new defects.

Conclusion
The testing process is crucial to ensure the quality and reliability of software products. It helps to identify defects and verify that the software meets the specified requirements. By following the steps involved in the testing process and performing different types of testing, developers can deliver software that is functional, performant, and secure.
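As an illustration of the test design and execution steps above, a minimal sketch using Python's built-in unittest module is given below; the function under test and its requirement are invented for this example.

```python
# Minimal sketch of test design and execution with Python's unittest.
# The function under test and its requirement are invented for illustration.

import unittest

def apply_discount(price: float, percent: float) -> float:
    """Requirement: discount must be between 0 and 100 percent."""
    if not 0 <= percent <= 100:
        raise ValueError("percent out of range")
    return round(price * (1 - percent / 100), 2)

class TestApplyDiscount(unittest.TestCase):   # test cases designed from the spec
    def test_normal_discount(self):
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    def test_boundary_values(self):
        self.assertEqual(apply_discount(50.0, 0), 50.0)
        self.assertEqual(apply_discount(50.0, 100), 0.0)

    def test_invalid_discount_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(50.0, 101)

if __name__ == "__main__":
    unittest.main()   # execution step: runs the cases and reports results
```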
Requirements validation testing on the 7 optical fiber array connector/cable assemblies for the Lunar Reconnaissance Orbiter (LRO)

Melanie N. Ott a*, Xiaodan "Linda" Jin b, Frank V. LaRocca c, Adam Matuszeski a, Richard F. Chuska c, Shawn L. MacMurphy c
a NASA Goddard Space Flight Center, Greenbelt, Maryland 20771
b Perot Systems Government Services, Fairfax, VA 22031
c MEI Technologies, Seabrook, Maryland 20706

ABSTRACT
In the past year, a unique capability has been created by NASA Goddard Space Flight Center (GSFC) in support of Lunar Exploration. The photonics group, along with support from the Mechanical Systems Division, developed a seven fiber array assembly using a custom Diamond AVIM PM connector for space flight applications. This technology enabled the Laser Ranging application for the LRO to be possible. Laser pulses at 532 nm will be transmitted from the earth to the LRO stationed at the moon and used to make distance assessments. The pulses will be collected with the Laser Ranging telescope and focused into the array assemblies. The array assemblies span down a boom, through gimbals and across the spacecraft to the instrument, the Lunar Orbiter Laser Altimeter (LOLA). Through use of a LOLA detector, the distance between the LRO and the Earth will be calculated simultaneously while LOLA is mapping the surface of the moon. The seven fiber array assemblies were designed in partnership with W.L. Gore, Diamond Switzerland, and GSFC, manufactured by the Photonics Group at NASA Goddard Space Flight Center (GSFC) and tested for environmental effects there as well. Presented here are the requirements validation testing and results used to ensure that these unique assemblies would function adequately during the Laser Ranging 14-month mission. The data and results include in-situ monitoring of the optical assemblies during cold gimbal motion life-testing, radiation, vibration and thermal testing.

Keywords: Optical, Fiber, Array, LIDAR, Ranging, Connector, Spaceflight, LRO, LOLA, Qualification

1. INTRODUCTION
The Laser Ranging (LR) mission was an add-on to the LRO, adopted soon after it was demonstrated that laser pulses between the Mercury Laser Altimeter, a previous NASA GSFC instrument launched in August 2004, could span over 24 million kilometers of space to the station at Greenbelt, MD and back. [1-2] The motivation behind precise distance measurements between the LRO and the earth is to enhance the existing gravity model, under Principal Investigators David Smith and Maria Zuber. The challenge was how to get light from earth-based laser pulses over to the other side of the LRO, to a detector on the LOLA instrument, while that instrument simultaneously focused its laser and receiver optics on the moon. The solution was to use a long fiber optic cable to move the light from the LR receiver telescope across the X-Y High Gain Antenna System (HGAS) gimbals, down the HGAS boom, across a one-time deployable mandrel, then around the other side of the spacecraft to a LOLA detector. The total distance is less than 10 meters, but it is covered under bending constraints, cold temperatures and motion. The seven fiber bundle design was investigated and, due to the high performance requirements, the decision to use the fiber bundle along with a custom Diamond AVIM optical fiber connector was made early during design. The seven hole pattern was drilled into a stainless steel version of the Diamond AVIM ferrule to be compatible with a slightly larger polarization maintaining (PM) type connector.
The purpose of using a PM type of connector was to allow interconnection of fiber bundle assembly to fiber bundle assembly. The effort has continued through development since November of 2005 and is currently in flight integration on the LRO. The following details the testing and lessons learned while developing the optical fiber bundle assemblies for the Laser Ranging mission on LRO.

*Melanie.N.Ott@, 301-286-0127, URL: /photonics

2. EXPERIMENTAL DISCUSSION
The Laser Ranging application required a total of three cables to allow ease during integration with several subsystems. The subsystems are the gimbals, the HGAS boom arm and the LRO spacecraft. Due to the necessity of three assemblies, the interconnection between those assemblies required special attention. The optical fiber array connector ferrules were a key item of focus in regards to regulating the insertion loss at those interfaces. In addition, for reliability, the termination procedure was a high focus area as well.

The fiber used for the bundle was manufactured by Polymicro Technologies and the FON 1416 cable was manufactured by W.L. Gore. The fiber ferrules were made of stainless 416 and are Diamond Switzerland drawing number 070-040-230V001_55; the custom PM connector kit is Diamond D 6206.6/S NASA 1036529. The custom ferrules were drilled using a GSFC-designed flower pattern, and the drilling was conducted at GSFC and at Diamond Switzerland. The cable was made of seven flexlite cabled optical fiber strands wrapped and twisted around a PTFA buffer strand of approximately the same outer diameter as the flexlite cable. The entire set of 7 flexlite cables is upjacketed with Gore Tex wrap followed by a PFA outer jacket. The first version of the cable bundle (FON 1416) consisted of a twist of the flexlite strands 360 degrees around the middle buffer every 3 inches, which resulted in a very high insertion loss, ~0.4 dB/m, likely due to microbending stresses. When the bundle manufacturing design was changed to an 18 inch twist configuration, the insertion loss reduced to 0.07 dB/m. The Diamond low profile adapter was included in the design for interconnecting the bundles.

Qualification testing for the fiber bundle array assemblies included cable preconditioning testing, repeatability testing of the connector interface, vibration testing, thermal cycling, gimbal life cold motion testing, routing-bending testing and radiation testing. The LR assemblies were tested in sets of three, as similar to the actual application as possible. Figure 1 shows three bundle array assemblies mated together to make the complete LR set. Figure 2 shows an earlier version of a single bundle array assembly and the configuration of the end faces in that assembly. The end face pictures in Figure 2 show how each of the ferrules was cut into a custom "flower" pattern to accommodate the outer diameter of each outer fiber and the inner fiber while limiting the amount of epoxy necessary.

Figure 2: LR Fiber Optic Bundle Array Assembly with End Face Pictures.

The three assemblies were cut to length per the length requirements for the spacecraft integration. The assemblies were terminated and verified for adequate optical performance. Prior to termination, thermal preconditioning was performed on the cable bundle and the connector ferrules were inspected for proper specification dimensions. At the time of this testing, the ferrules were still slightly out of specification since the fine tuning of the custom drilled flower pattern was still underway.
However, the testing data provided here was not meant to provide absolute loss values but relative values, such that reliability would be assured and delta insertion loss values could be set for system level allocations and analysis. The cable assemblies were clocked to match reference assemblies such that when two bundle array assemblies were mated together they were optimized for maximum power throughput for the majority of the fiber channels.

Due to the need to monitor all channels, there are two extra fiber interfaces in the testing set up in addition to the two interconnections required for the mission. The actual application will have an open beam on the input (receiver telescope) with an open beam detector (at LOLA) on the output, and both are adjustable for system optimization. All insertion loss test set ups have an additional interconnection for in-lab testing. Therefore, compared to the actual implementation, the in-lab measurements over-estimate the performance insertion loss. Table 1 summarizes the lengths and insertion loss of the test sets for environmental characterization.

Table 1: Laser Ranging Engineering Model Test Cable Assembly Set Lengths and IL

DUT Set   Assembly ID #   Length (m)   Average IL for Set @ 532 nm
SET 1     LR-EM-008       1.59         2.0 dB
          LR-EM-004       2.87
          LR-EM-010       3.80
SET 2     LR-EM-011       1.56         2.7 dB
          LR-EM-009       2.70
          LR-EM-007       3.72

2.1 Thermal Preconditioning
The outer jacket of the FON 1416 was made of a PFA Teflon for which GSFC had no thermal preconditioning data. A typical profile was used where, depending on the cable type and thermal environment, 60 cycles usually eliminates a majority of the cable material shrinkage such that the cable shrinks no more than 0.1%. [3] The upper thermal limit was based on the system level survival thermal requirement. Thermal requirements for the Laser Ranging fiber optic assemblies were set to -55°C to +80°C for survival, and -45°C to +70°C for operation. For operations involving motion, such as in the gimbals, the lower limit is -20°C for survival and -10°C for operation. Based on the requirements, the thermal preconditioning limits were set to -30°C to +80°C with dwells of 30 min at the lower temperature and 60 min at the upper limit. During the preconditioning study it was determined that even after 80 cycles, the cable PFA outer jacket would not reach the 0.1% limit. With a twist of the cables occurring less often, at 18 inches as opposed to 3 inches, the flexlite cables would not move the same amount length-wise during cable flexing. The 3 inch twist had provided a sturdy design for flexing, although a highly lossy one. Therefore, the problems that arose during the termination and testing procedures led us to believe that the bundle jacketing needed to be removed from the terminated connector.

2.2 Repeatability Testing
The team had concerns for integration since there would be many opportunities for the hardware to be mated and unmated. Several times during the development, mating repeatability tests were conducted to gather data on the insertion loss changes expected, as well as to assure that the system would perform reliably during integration. The assemblies were inserted into an adapter such that the two mated, and the insertion loss was monitored over thirty matings. During testing of the brass low profile adapters, the losses never exceeded 0.15 dB provided the adapters were cleaned once when the fit became too tight.

2.3 Vibration Testing
Random vibration testing was conducted on two LR sets that consisted of three assemblies each.
2.3 Vibration Testing

Random vibration testing was conducted on two LR sets of three assemblies each. Three axis testing was conducted using three different profiles for 3 minutes per axis. The vibration profile used for each test is contained in Table 2.[4]

Table 2: Random Vibration Acceleration Spectral Density (ASD) Levels vs. Frequency

Frequency Range (Hz)   Test 1 ASD Levels   Test 2 ASD Levels   Test 3 ASD Levels
20                     0.052 g²/Hz         0.026 g²/Hz         0.013 g²/Hz
20-50                  +6 dB/octave        +6 dB/octave        +6 dB/octave
50-800                 0.32 g²/Hz          0.16 g²/Hz          0.08 g²/Hz
800-2000               -6 dB/octave        -6 dB/octave        -6 dB/octave
2000                   0.052 g²/Hz         0.026 g²/Hz         0.013 g²/Hz
Overall                20 grms             14.1 grms           10 grms

These three levels bound the vibration environments expected from system level integration through launch.
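The overall grms figures in Table 2 follow from integrating each ASD shape over frequency; on the ±6 dB/octave segments the ASD varies as a power of frequency, so each segment integrates in closed form. The sketch below is an independent numerical check under that standard interpretation, not flight test software; it reproduces the 20, 14.1 and 10 grms overall levels from the tabulated plateau values.

```python
import math

def segment_area(f1, f2, a1, slope_db_oct):
    """Area under an ASD segment starting at (f1, a1) g^2/Hz and rolling at
    slope_db_oct dB/octave up to f2.  ASD(f) = a1 * (f/f1)**b with
    b = slope_db_oct / (10*log10(2))."""
    b = slope_db_oct / (10.0 * math.log10(2.0))
    if abs(b + 1.0) < 1e-9:  # b == -1 gives the logarithmic special case
        return a1 * f1 * math.log(f2 / f1)
    return a1 * f1 / (b + 1.0) * ((f2 / f1) ** (b + 1.0) - 1.0)

def grms(plateau):
    """Overall grms for the Table 2 shape: +6 dB/oct from 20-50 Hz,
    flat 50-800 Hz at `plateau` g^2/Hz, -6 dB/oct from 800-2000 Hz."""
    a20 = plateau * (20.0 / 50.0) ** 2  # ASD at 20 Hz; matches the tabulated 0.052 etc.
    area = segment_area(20.0, 50.0, a20, +6.0)
    area += plateau * (800.0 - 50.0)
    area += segment_area(800.0, 2000.0, plateau, -6.0)
    return math.sqrt(area)

for plateau in (0.32, 0.16, 0.08):  # Test 1, 2, 3 plateau levels from Table 2
    print(f"{plateau} g^2/Hz plateau -> {grms(plateau):.1f} grms")
# -> 20.0, 14.1, 10.0 grms, matching the Table 2 overall levels
```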
During testing the cable assembly set is monitored in-situ at 850 nm to eliminate system noise from the test results and to catch any damage as it occurs. The before and after measurements are performed at 532 nm on a separate test setup and therefore include noise related to changing test setups; end faces are inspected post test to verify that no cracking of the fiber has occurred. Figure 3 shows the testing setup with in-situ monitoring.

Figure 3: Random Vibration Testing Setup for the Laser Ranging Optical Fiber Assemblies.

At the input, an 850 nm LED was connected to a splitter so that light could be injected into the seven fiber array assembly set, with one channel of the splitter connected to a detector. The optical source is monitored for power transients so that those variations can be subtracted from the end result test data. The reference fan out assemblies used here have an array optical fiber connector on one side and seven individual FC optical fiber connectors on the other. These fan out assemblies, which take individual channels and couple them into the array set (and vice versa), are used on the input and output sides of the setup. The individual channels are monitored with the HP 8166 multi-channel optical power meter and logged to a data file using LabVIEW software. Even though all channels were monitored, the data is presented as an average for the bundle, since no single channel cracked or increased in insertion loss beyond acceptable limits.

For the insertion loss numbers in Table 3, note that negative numbers represent a gain and positive numbers an increase in loss. It is also important to remember that although 850 nm is not the operational wavelength for the integrated system, this test was primarily a test of the interconnections at each interface, including those to the reference assemblies, so that wavelength was adequate for providing delta change results for the mechanical optical interface during vibration testing. When the assembly sets were tested at 532 nm, they were removed from the in-situ setup and inserted into a different insertion loss measurement system; this accounts for the difference between the measured changes in insertion loss at 850 nm and at 532 nm. All end faces were inspected after testing was complete to validate the integrity of the fiber end faces.

Table 3: Random Vibration Optical Insertion Loss Change (ΔIL) Test Results

Vibration Level   Axis      SET 1 ΔIL @850 nm (dB)   SET 2 ΔIL @850 nm (dB)
20 grms           x-axis     0.026                     0.097
                  y-axis     0.010                     0.011
                  z-axis     0.036                    -0.006
14.1 grms         x-axis     0.003                     0.053
                  y-axis     0.003                    -0.023
                  z-axis     0.039                    -0.019
10 grms           x-axis     0.017                    -0.020
                  y-axis     0.015                     0.011
                  z-axis     0.000                     0.016
Post test ΔIL @532 nm (dB)                 0.37                      0.11
Largest set-average loss ~Δ (dB)           0.04                      0.10
Largest single channel loss ~Δ (dB)        0.09                      0.18

The cable assemblies performed adequately during vibration testing and were then moved into a thermal insertion loss experimental setup to validate the thermal levels expected for the Laser Ranging optical fiber assemblies.

2.4 Thermal Testing

Both LR engineering model sets were tested to the survival thermal requirements for the Laser Ranging assemblies. The test was conducted to validate that the cable assemblies as a set would function adequately over the life of the mission and during thermal exposure. Since this type of aging test takes weeks, the ramp rate was accelerated relative to the ramp rate used for thermal workmanship testing. The output power was once again monitored in-situ at both 850 nm and 532 nm. Because there were not sufficient resources to monitor all 14 channels at 532 nm, 850 nm was used on the channels not measured at 532 nm to detect any cracking of the fiber during testing. After testing was complete, the assemblies were again measured in an insertion loss measurement system at 532 nm and the end faces were inspected for any potential damage.

Figure 4: Thermal Cycling Testing Setup for the Laser Ranging Optical Fiber Bundle Array Assemblies.

For this test, the thermal range was -55°C to +80°C for 100 cycles, at 2°C/min with 30 minute dwells at the extremes. The reference fan out assemblies were again used to couple light in and out of the LR test sets. The interconnections of both fan out arrays to the test sets were placed inside the thermal chamber so that all interfaces were exposed to the thermal extremes. Therefore, instead of open beams at either end of the assemblies there were two connections to other array assemblies, which would potentially overestimate the losses over the thermal range. For registering the relative insertion losses at 850 nm, the HP 8166 was used; for registering the data at 532 nm, a Newport 2930C and an HP 8153 with a visible detection module were used. Again the source was monitored, which left three channels for monitoring thermally induced relative insertion loss at 532 nm. One of the outer channels was chosen for monitoring on each assembly set: on Set 1, channel 6 was monitored, and on Set 2, channels 2 and 6. During testing the thermal data was captured for comparison against the registered losses over temperature. As expected, the losses rose as the temperature dropped. After the 100 cycle thermal test was complete, one set was removed from the test setup and measured, while the other was tested for an additional 10 cycles using a profile of -70°C to +85°C, a 2°C/min ramp rate, a 30 min dwell at +85°C and a 60 min dwell at -70°C.
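The earlier remark that this type of aging test takes weeks follows directly from the profile parameters just given. A quick back-of-the-envelope sketch, assuming the stated 2°C/min ramps and no chamber overhead:

```python
# Why a 100-cycle aging run takes weeks: -55 C to +80 C, 2 C/min ramps, 30 min dwells.
def cycle_hours(t_low, t_high, ramp_c_per_min, dwell_low_min, dwell_high_min):
    ramp = (t_high - t_low) / ramp_c_per_min  # minutes for one ramp direction
    return (2 * ramp + dwell_low_min + dwell_high_min) / 60.0

h = cycle_hours(-55.0, 80.0, 2.0, 30.0, 30.0)
print(f"one cycle: {h:.2f} h; 100 cycles: {100 * h / 24:.1f} days")
# -> one cycle ~3.25 h, so 100 cycles is ~13.5 days of continuous chamber time
```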
The data is presented in Tables 4 through 9. The maximum relative change in optical transmittance is shown in Table 4 for Set 1 and in Table 5 for Set 2. Note that the change in transmittance includes both losses and gains over the thermal range. It was expected that the losses at 850 nm would be higher than at 532 nm, since the fiber was not designed for the 850 nm wavelength; that data was used simply to monitor for fiber cracking over the 100 cycles.

Table 4: Set 1, Maximum Relative Transmission Change in dB over 100 thermal cycles, -55°C to +80°C

WL        Fiber #1   Fiber #2   Fiber #3   Fiber #4   Fiber #5   Fiber #6   Fiber #7
@532 nm      -          -          -          -          -         0.40        -
@850 nm     0.78       0.70       1.07       0.96       0.99        -         0.53

Table 5: Set 2, Maximum Relative Transmission Change in dB over 100 thermal cycles, -55°C to +80°C

WL        Fiber #1   Fiber #2   Fiber #3   Fiber #4   Fiber #5   Fiber #6   Fiber #7
@532 nm      -         0.40        -          -          -         0.50        -
@850 nm     1.27        -         0.62       0.82       0.64        -         1.06

The maximum transmission change at 532 nm was 0.5 dB across the three channels monitored; the other two channels both registered maximum transmission changes of 0.4 dB. The final relative insertion loss after completion of the 100 cycle test, with the temperature at ~25°C, is presented in Table 6 for Set 1 and in Table 7 for Set 2.

Table 6: Set 1, Relative Insertion Loss (dB) Registered after 100 Thermal Cycles Complete

WL        Fiber #1   Fiber #2   Fiber #3   Fiber #4   Fiber #5   Fiber #6   Fiber #7
@532 nm      -          -          -          -          -         0.03        -
@850 nm     0.06      -0.02       0.25       0.12      -0.05        -         0.007

Table 7: Set 2, Relative Insertion Loss (dB) Registered after 100 Thermal Cycles Complete

WL        Fiber #1   Fiber #2   Fiber #3   Fiber #4   Fiber #5   Fiber #6   Fiber #7
@532 nm      -        -0.22        -          -          -        -0.32        -
@850 nm    -0.78        -        -0.27      -0.49      -0.61        -        -0.59

Tables 6 and 7 show low losses for Set 1 and small gains for Set 2 in the final relative insertion loss. Both assembly sets therefore completed the 100 cycle thermal test to the survival limits without incident, and once the temperature returned to room temperature the losses registered at or below 0.03 dB at 532 nm.

Once Set 1 was removed from the thermal chamber, Set 2 was tested for an additional 10 cycles over an extended thermal range. Even though the lower limit of the survival requirement is -55°C and the assemblies had performed well, we were curious whether the new 416 stainless steel ferrules would outperform the previously tested assemblies with 303 stainless steel ferrules, which had shown problems once the thermal range was reduced below -60°C. This additional test of 10 cycles was validation that the change in materials indeed alleviated the risk down to -70°C.

Table 8: Set 2, Maximum Relative Transmission Change in dB over 10 thermal cycles, -70°C to +85°C

WL        Fiber #1   Fiber #2   Fiber #3   Fiber #4   Fiber #5   Fiber #6   Fiber #7
@532 nm      -         0.40        -          -          -         0.55        -
@850 nm     1.53        -         1.06       0.89       0.93        -         1.35

At 532 nm the transmission change is slightly larger than when the thermal range was narrower by 20 degrees; this was expected.

Table 9: Set 2, Relative Insertion Loss (dB) Registered After the Additional 10 Thermal Cycles Complete

WL        Fiber #1   Fiber #2   Fiber #3   Fiber #4   Fiber #5   Fiber #6   Fiber #7
@532 nm      -        -0.10        -          -          -        -0.07        -
@850 nm    -0.12        -        -0.07      -0.16      -0.22        -        -0.15

Table 9 shows the final data for cable Set 2 after the additional 10 thermal cycles were complete. There were no cracked fibers as identified by this data, and all end faces remained undamaged per visual inspection after testing was complete. The data shows slight gains per channel at both wavelengths; slight gains registered after a thermal cycling test have also been seen in previous years when testing array MTP connectors.[5,6]
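As noted above, the in-situ traces served mainly to watch for fiber cracking, which shows up as an abrupt, persistent step in transmission rather than the slow, reversible change that tracks chamber temperature. A minimal screening sketch along those lines follows; the source normalization mirrors the monitored-source setups described earlier, while the 2 dB step threshold is purely a hypothetical choice sitting above the largest reversible swing (~1.5 dB) seen in Tables 4, 5 and 8.

```python
import math

def rel_transmission_db(p_ch, p_src, p_ch0, p_src0):
    """Relative transmission change (dB) for one channel, with source drift removed
    by normalizing each reading to the monitored source power.  Negative values
    mean the channel is transmitting less than at the start of the test."""
    return 10.0 * math.log10((p_ch / p_src) / (p_ch0 / p_src0))

def flag_possible_crack(trace_db, step_db=2.0):
    """Return the index of the first abrupt drop between consecutive samples.
    A crack appears as a sudden persistent step; the reversible change that tracks
    chamber temperature moves far less between adjacent samples."""
    for i in range(1, len(trace_db)):
        if trace_db[i - 1] - trace_db[i] > step_db:
            return i
    return None

# Hypothetical trace: slow thermal variation, then a 3 dB step at sample 4.
trace = [0.0, -0.3, -0.6, -0.8, -3.8, -3.9]
print(flag_possible_crack(trace))  # -> 4
```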
After the thermal testing was completed, each assembly was measured in the standard Laser Ranging insertion loss measurement system at 532 nm. Set 2 measured 2.85 dB, compared to 2.81 dB after vibration and prior to thermal testing, a change of 0.04 dB across thermal testing. Set 1 was examined prior to testing in the IL measurement system and was cracked during a measurement of how far the outermost bundle jacket had shrunk back. The jacket had shrunk nearly 2 cm, leaving the inner cables vulnerable to handling related failures; the fibers, now lacking proper strain relief, cracked easily when the cable was bent just behind the connector during examination. The final post-thermal measurement for Set 1 at 532 nm showed that the handling related crack increased the bundle insertion loss by 0.85 dB.

2.5 Radiation Testing

The gamma radiation test was performed on three samples at NASA GSFC's Cobalt 60 Radiation Facility in Code 561. Two cable samples were exposed at room temperature at different dose rates (low and high) and one at cold temperature at the low dose rate. Table 10 shows the three conditions.

Table 10: Radiation Induced Insertion Loss Test Conditions for Three 9.5 meter Samples

Condition     Dose Rate       Temperature   Total Dose
Condition 1   152 rads/min    24°C          1.1 Mrad
Condition 2   18.2 rads/min   24°C          130 Krad
Condition 3   18.2 rads/min   -50°C         130 Krad

Three 9.5 meter spools of flexlite cable (single strands, not in the seven fiber bundle) were placed in the test setup such that only the spools would be exposed; the lead-in and lead-out cables were not. The radiation dose was monitored, as was the optical transmission through each of the fibers under test. Data was logged once per minute, and the source was also monitored once per minute. The power level was reduced to below 1 microwatt per channel at the input to the fiber under test at 532 nm. The same Newport detectors and HP power meter were used to monitor the fibers and the source. All measurements were compared to the fiber transmission just before the radiation exposure began. Once the testing was completed, all the data was analyzed to produce an extrapolation model. In previous publications that included radiation data, we presented an enhanced version of the Friebele model for use over various temperatures. The extrapolation equation takes the form

A(D) = C0 φ^(1-f) D^f    (1)

where A(D) is the radiation induced attenuation, D is the total dose, φ is the dose rate, C0 is a constant and f is a constant less than one.[6] Based on equation (1), no general model can be derived without making some assumptions about the constants C0 and f. Two sets of data are necessary to determine which C0 and f are appropriate for using the equation to extrapolate to other dose rates and temperatures. Under the assumption that f is a linear function of temperature T and that C0 is a linear function of dose rate φ, a general model for other dose rates and temperatures can be generated using all three data sets. Solving for f(T) using the data sets of condition 2 and condition 3 (two different temperatures at the same dose rate) gives

f(T) = -2.7027 × 10^-4 T + 0.6565    (2)

Solving for C0(φ) using the data sets of the high and low dose rate tests at room temperature (conditions 1 and 2) gives

C0(φ) = -7.27 × 10^-7 φ + 1.4887 × 10^-4    (3)

In equation (3), as the dose rate becomes very small (less than 1 rad/min, which is typical of space flight background radiation), C0 approaches 1.4887 × 10^-4, independent of dose rate.
Under the assumption that most space flight environments have background radiation at levels below 1 rad/min, the radiation-induced attenuation (dB) at a room temperature of 24°C can be extrapolated as

A(D) = 1.4887 × 10^-4 φ^0.35 D^0.65    (4)

To scale the units to dB/m, A(D) is divided by L = 9.5 m, the actual length of the cables under test. Figure 5 shows the extrapolation curve for a dose rate of 1 rad/min, up to a total dose of 200 Krads, at 24°C and at -50°C.

Figure 5: Extrapolated radiation induced attenuation (dB/m) versus total dose (rads) at a dose rate of 1 rad/min, up to 200 Krads, at 24°C and -50°C.

To compare the effect of lowering the temperature on the extrapolation results: at -50°C the calculated radiation induced attenuation at 200 Krads is 0.056 dB/m, while for the same conditions at 24°C it is slightly lower at 0.044 dB/m, as can be seen in Figure 5. This extrapolation model allowed us to set realistic allocations for the radiation induced loss over a range of worst case cold temperatures and shielding conditions across the LRO spacecraft. The bundle materials that surround the flexlite cable in the actual application do provide some shielding, but they do not significantly reduce the radiation induced attenuation.
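Equations (1) through (4) are straightforward to evaluate numerically. The sketch below implements them with the coefficients quoted above and reproduces the 0.044 dB/m and 0.056 dB/m figures at 200 Krads and 1 rad/min; it is a reading of the published fit, not the original analysis code.

```python
# Numerical sketch of the extrapolation model in equations (1)-(4).
# Coefficients are taken directly from the text; L = 9.5 m is the tested spool length.
def f_of_T(t_c: float) -> float:
    """Equation (2): dose exponent f as a linear function of temperature (deg C)."""
    return -2.7027e-4 * t_c + 0.6565

def c0_of_phi(phi: float) -> float:
    """Equation (3): C0 as a linear function of dose rate phi (rads/min).
    At phi <= 1 rad/min this is ~1.4887e-4, recovering equation (4)."""
    return -7.27e-7 * phi + 1.4887e-4

def induced_atten_db(dose_rads: float, phi: float, t_c: float) -> float:
    """Equation (1): A(D) = C0 * phi**(1-f) * D**f, in dB for the 9.5 m length."""
    f = f_of_T(t_c)
    return c0_of_phi(phi) * phi ** (1.0 - f) * dose_rads ** f

L = 9.5  # meters of fiber exposed
for t in (24.0, -50.0):
    a = induced_atten_db(200e3, 1.0, t) / L
    print(f"{t:+.0f} C, 200 krad @ 1 rad/min: {a:.3f} dB/m")
# -> ~0.044 dB/m at +24 C and ~0.056 dB/m at -50 C, matching the values quoted above
```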
2.6 Gimbal Motion Test

One of the earliest concerns for the LR mission was whether the optical fiber bundle could survive being flexed in the set of gimbals at cold temperature. The first test, summarized in reference 7, was conducted with a single strand of flexlite cable, not in the 7 fiber bundle. When the new bundle arrived with the 3 inch twist, it was terminated to individual FC optical fiber connectors at each end, inserted into the gimbal, and routed so that it would be included in two cable wraps to simulate the routing of the cable through a two gimbal system. Due to the size of the gimbals, this method was the only way to fit the gimbal system into a thermal chamber such that it could be maintained at -20°C and -10°C while in motion. The objectives of this test were 1) to ensure that the cable would not break during the life of the mission and 2) to identify the expected changes in transmission while flexing the gimbals at cold temperature, where the largest losses were expected to occur. The test was therefore conducted in-situ at 850 nm and 532 nm. Testing on the bench outside the thermal chamber and inside the chamber at cold temperature showed no difference in the change in transmittance for the 7 fiber bundle, so most channels were monitored at 850 nm, with measurements at 532 nm made weekly on all channels. The mission requirement for the gimbal motion was approximately 5500 mechanical cycles, where going from the zero position to the 180° position and back constituted a complete cycle. The gimbal cycling test at cold temperature was conducted for nearly 17,000 mechanical cycles without incident, and the gimbal system with the fiber bundle endured another 2000 cycles at room temperature.

The optical transmittance change was higher than expected for the bundle, but the bundle with the 3 inch twist was under a great deal of stress due to its design, as demonstrated by the unbent insertion loss of the optical fiber at 532 nm. The test will be repeated with the 18 inch twist bundle in the next few months, and those results will be published at a later time. The pictures in Figures 6a and 6b show the front of the gimbal with the cable in the cable wrap and the back view of the gimbal, where the cable was looped back down the shaft to be wrapped into another cable wrap. Figure 6a shows the gimbal in the tight position (zero degrees); Figure 7a shows the gimbal in the loose position (180 degrees).

Figure 6: a) Front view of the cable inside the gimbal in the tight configuration; b) back view of the gimbal.

Figure 7: a) Gimbal in the loose position; b) gimbal with fiber bundle integrated, inside the thermal chamber.

The bundle was monitored throughout the gimbal motion cold test; Table 11 lists the parameters of the life test. Since both cable wraps were fitted in the tight position, both were tight at the zero degree gimbal position and both were loose at 180 degrees. This represented the worst case scenario for the optical fiber relative insertion loss, with both cable wraps tight and loose at the same time.

Table 11: Parameters of the Mechanical Gimbal Life Testing at Cold Temperatures

Gimbal Temperature   Wavelength   Mechanical Cycles
-20°C                532 nm       223
-20°C                850 nm       6301
-10°C                850 nm       10497
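As a quick consistency check, the per-condition cycle counts in Table 11 add up to the "nearly 17,000" cold cycles quoted above, roughly three times the mission requirement:

```python
# Tally of the cold-motion life test in Table 11 against the mission requirement.
segments = {(-20, 532): 223, (-20, 850): 6301, (-10, 850): 10497}  # cycles per condition
total_cold = sum(segments.values())
required = 5500  # approximate mission requirement for gimbal motion
print(f"cold cycles: {total_cold}, margin: {total_cold / required:.1f}x")
# -> 17021 cold cycles, ~3.1x the requirement (plus 2000 more cycles at room temperature)
```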