1. Work Breakdown Structure
Three Point Estimate
Effort of Activity = (Optimistic + 4 × Most Likely + Pessimistic) / 6
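A minimal sketch of this PERT-style calculation in Python; the activity values in the example are hypothetical:

```python
def three_point_estimate(optimistic: float, most_likely: float, pessimistic: float) -> float:
    """Expected effort for one WBS activity: (O + 4M + P) / 6."""
    return (optimistic + 4 * most_likely + pessimistic) / 6


if __name__ == "__main__":
    # Hypothetical activity estimated at 3 / 5 / 10 person-days.
    print(three_point_estimate(3, 5, 10))  # -> 5.5
```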
2. Lines of Code
Measure LOC after the coding phase
Measure LOC at the beginning of a maintenance or conversion project
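As a rough illustration, a physical LOC count can be approximated by counting non-blank, non-comment lines. What counts as a "line of code" below is an assumption; real LOC tools define this much more carefully:

```python
def count_loc(path: str, comment_prefix: str = "#") -> int:
    """Count non-blank lines that are not pure comments."""
    loc = 0
    with open(path, encoding="utf-8") as f:
        for line in f:
            stripped = line.strip()
            if stripped and not stripped.startswith(comment_prefix):
                loc += 1
    return loc
```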
3. Delphi
Several engineers individually produce estimates and then converge on a consensus estimate.
3.1 Each expert in the group is given the program's specification and an estimation form
3.2 The experts discuss the product and estimation issues
3.3 Each expert anonymously lists the project tasks and gives a size estimate
3.4 The moderator collects the estimates, tabulates the results, and returns them to the experts
3.5 Each expert can identify only their own estimate; all others remain anonymous
3.6 The experts discuss the results and review the tasks
3.7 The process repeats from step 3.3 until the estimates converge to within an acceptable range (see the sketch below)
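The loop in steps 3.3 through 3.7 can be sketched as follows. The `collect_estimates` callback and the 10% convergence tolerance are hypothetical placeholders, not part of the method definition:

```python
from statistics import mean


def delphi_rounds(collect_estimates, tolerance: float = 0.10, max_rounds: int = 5) -> float:
    """Repeat estimation rounds until the spread of estimates is acceptable."""
    avg = 0.0
    for round_no in range(1, max_rounds + 1):
        estimates = collect_estimates(round_no)   # step 3.3: anonymous estimates
        avg = mean(estimates)                     # step 3.4: tabulate the results
        spread = (max(estimates) - min(estimates)) / avg
        if spread <= tolerance:                   # step 3.7: converged?
            return avg
        # steps 3.5-3.6: experts see the tabulated results, discuss, re-estimate
    return avg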
4. Function Point
Five component types are counted:
* External Inputs (EIs): transactions that add, update, or delete records
* External Outputs (EOs): outputs such as record summaries and details
* External Inquiries (EQs): an input/output combination where the input immediately generates the output and no internal logical files are modified
* Internal Logical Files (ILFs): data model entities maintained within the application
* External Interface Files (EIFs): data model entities referenced by the application but maintained outside of it
Step 1: Determine the type of count: development project, enhancement project, or installed application
Step 2: Identify the counting boundary
Step 3: Determine the unadjusted function point count
Step 4: Determine the value adjustment factor
Step 5: Calculate the adjusted function point count (a calculation sketch follows the steps)
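A sketch of steps 3-5, assuming the standard IFPUG average complexity weights (EI=4, EO=5, EQ=4, ILF=10, EIF=7). A real count assigns low/average/high weights per item, and the counts and ratings below are hypothetical:

```python
# Average IFPUG complexity weights per component type.
AVERAGE_WEIGHTS = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}


def unadjusted_fp(counts: dict) -> int:
    """Step 3: unadjusted function point count (UFP)."""
    return sum(AVERAGE_WEIGHTS[t] * n for t, n in counts.items())


def value_adjustment_factor(gsc_ratings: list) -> float:
    """Step 4: VAF from the 14 general system characteristics, each rated 0-5."""
    return 0.65 + 0.01 * sum(gsc_ratings)


def adjusted_fp(counts: dict, gsc_ratings: list) -> float:
    """Step 5: adjusted function point count = UFP x VAF."""
    return unadjusted_fp(counts) * value_adjustment_factor(gsc_ratings)


if __name__ == "__main__":
    counts = {"EI": 6, "EO": 4, "EQ": 3, "ILF": 5, "EIF": 2}  # hypothetical counts
    gscs = [3] * 14  # average influence for all 14 characteristics
    print(adjusted_fp(counts, gscs))  # UFP = 120, VAF = 1.07 -> 128.4
```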